Saturday, October 31, 2009

Don't Go In There!!! 10 Lessons From Our Favorite Horror Movies

Scary movies are fun. You're safe, after all. You may be snuggled on your sofa with the lights off, but the door is securely locked, you know there's nothing hiding under the bed, and you've got enough ice cream to get you through whatever may come your way. If you're in a theater, well, there's safety in numbers, isn't there? In scary movies, only foolish people who dare to wander about alone are ever targeted—and you would never go anywhere desolate alone, would you?

Scary movies can teach us important survival tips. (There's also a good, albeit typo-ridden, list here. But hey, in a horror movie no one really cares if you can spell. All that really matters is how fast you can run.) It's thanks to the careful study of horror movies that we have emergency planning tips for surviving a zombie apocalypse. I, for one, am glad that these models have been mathematically tested, and I suspect you probably are too—you wouldn't want to find out that your planning was for naught. It would kind of be like learning that the whole deal about garlic and vampires is a myth. Come to think of it, maybe some of the panic over H1N1 could be alleviated with this kind of planning and testing.
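Those zombie-preparedness papers rest on SIR-style epidemic equations. Purely as a playful illustration—the structure below loosely follows the susceptible–zombie–removed shape of models in that literature, but every parameter value is invented for the sketch—here's a toy simulation:

```python
# Toy "zombie outbreak" simulation. The structure loosely follows the
# susceptible-zombie-removed (SZR) models from the mathematical
# zombie-epidemiology literature; all parameter values are made up
# purely for illustration.

def simulate(days=30, dt=0.001, S0=500.0, Z0=1.0, R0=0.0,
             beta=0.0095, alpha=0.005, zeta=0.02):
    """Euler-integrate the toy model and return final (S, Z, R).

    beta  -- transmission rate (a zombie bites a susceptible)
    alpha -- rate at which humans manage to destroy zombies
    zeta  -- rate at which the removed rise again
    """
    S, Z, R = S0, Z0, R0
    for _ in range(round(days / dt)):
        dS = -beta * S * Z                        # humans bitten
        dZ = beta * S * Z + zeta * R - alpha * S * Z
        dR = alpha * S * Z - zeta * R             # destroyed, then rising again
        S, Z, R = S + dS * dt, Z + dZ * dt, R + dR * dt
    return S, Z, R

if __name__ == "__main__":
    S, Z, R = simulate()
    print(f"Day 30: {S:.0f} humans, {Z:.0f} zombies, {R:.0f} removed")
```

True to the published models' grim conclusion, the zombies win under these made-up numbers without aggressive intervention—run it and watch the human count collapse. Hence the value of testing your plan before you need it.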

Well, being a social scientist, I had to add my own analysis to this growing body of literature. Here are 10 ethnographic lessons we can take away from our favorite horror movies:


(Special note: These are all rather old movies, but I do include some plot information, so proceed with caution. If you're concerned about spoilers, you may just want to read the bold text. Also, given the huge number of remakes, it's likely that you've seen a preview for an upcoming release based on a few of these, or you've seen them recently as a re-release.)

1. We all want to belong—it's not nice to make fun of others. In Carrie, the socially underdeveloped girl for whom the movie is named has a run-in with the resident school clique. The resulting anxiety causes Carrie to develop telekinetic abilities. The clique plots to get her to prom and sets her up with the clique leader's boyfriend. At prom, members of the clique manage to get Carrie elected prom queen. As Carrie stands on stage, hardly daring to believe that she's been accepted by her classmates, the clique dumps a bucket of blood on her, and the crowd laughs. Big mistake. Carrie uses her telekinetic powers to punish the crowd in various ways. She finally seals the gym as an electrical fire breaks out, and returns home for a confrontation with her mother.

Carrie reminds us that high school can be a lonely time. It also marks the last stage of the "playground" where we learn socialization skills. No one wants to be excluded. No one wants to be the joke. People haven't yet fully realized their potential at this stage. So that geek in science class you made fun of but secretly copied off of during the midterms could go on to be successful, as movie lore teaches us. So be nice. With this economy, you never know when you'll need a job.

2. The private domain, the household, is important to the proper socialization of individuals. The household is a very special place—it is where we learn the rules of our society, where we learn to navigate the social roles we will occupy as adults within the networks that make up social organization. The Halloween series shows us the dangers to the individual when the household fails to work as it should. [Image right: Abandoned house on Eastern Long Island.]

All six-year-old Michael Myers wanted was to go trick-or-treating. He got dressed in his clown costume and waited for his older sister. But she had other plans. Charged with watching Michael while her parents went to a Halloween party, she instead invited her boyfriend over to relieve the frustrations inherent to raging teen hormones.

Michael's sister did not act to preserve the private domain. Her responsibility was to ensure that Michael had access to a childhood rite—trick-or-treating—which would have exposed him to social interactions with others. The result was that a little boy felt as though he did not belong to the larger social order—he became a ruthless killer. (Think back to all the times you tortured a younger sibling, and be glad that things turned out okay.)

Interestingly, in the Rob Zombie remake of this movie, Michael's home life is actually fractured: his father is dead, his mom works as a stripper, and her boyfriend makes lewd comments to Michael's older sister. The explicit broken home metaphor sends the same message as the subtler messages in the original: home is important to shaping the individual, and if the home (i.e., the private) does not function as it should, the individual will never be fully integrated into society.

FYI: Michael is such a successful killer that he spawned his own survival guide and a rather cultish following.

3. Sleep is important to your physical and emotional well-being. The benefits of sleep have long been recognized by scientists: Sleep allows your body to recharge. Without sufficient sleep, memory, mood, and judgment suffer—we may even be more susceptible to illnesses. In A Nightmare on Elm Street, a group of teenagers is terrorized by the ghost of a child serial killer named Freddy Krueger. The first hint that Krueger means business occurs when Tina Gray dreams she is being stalked by a figure with razors on its fingers. She wakes just as the figure catches her, finding that she has cuts where the figure grabbed her in her dream. She learns that her friends all had similar dreams. It is ultimately revealed that Krueger was prematurely released from prison due to an administrative error, and an irate public murdered him. He is now stalking the children of the people who killed him.

The teenagers cannot go to sleep, for sleep is the realm in which Krueger is most powerful—and yet without sleep, the characters become increasingly hysterical. Krueger plays upon our need for sleep. We have to enter this vulnerable state. Nancy Thompson, the only remaining teen to escape Krueger's murderous onslaught, must reclaim the ability to sleep—and dream.

The Elm Street series blurs the line between the imaginary and reality. It reminds us that we need a balance of both to function as human beings.

4. Pay attention to erratic behavior—it generally signifies that something is very wrong. It's great to be passionate about something, but there is a fine line between passion and obsession. If a good friend or loved one begins to act obsessive or possessive about a particular object, as Arnie did in Christine, get help for the person immediately.

Arnie is a fairly plain, but nice, teenager. He decides to use his money—against his parents' wishes—to buy a clunker. He lovingly restores the car, becoming increasingly possessive and jealous of anyone who expresses an interest in it. He loses all interest in anything but the car. He alienates his best friend and girlfriend, believing that they too want to take his beloved car away. This type of behavior signifies a break with society. Arnie needed help—though perhaps you also loved your first car and can sympathize.

5. Reserve judgment about people who spend extended amounts of time in costume. Why do we wear masks? To pretend to be someone else. Masks allow us to suspend responsibilities and bypass appropriate behaviors because when we are masked, we aren't ourselves. Clown makeup takes this further: because clown makeup is applied directly to the face, it transforms the individual instead of temporarily suspending the individual's own beliefs. With clown makeup, the individual becomes another entity entirely, not just an individual pretending to be something or someone else.


Clowns were historically comical fellows. They likely evolved from court jesters. As a class, they elicit laughter and even encourage it. But can any group accept being the basis of the joke all the time? Perhaps if the humor were innocent, but there are mean-spirited people who enjoy derogatory humor. This presents the opportunity for jokers to turn the tables. Pennywise the Clown from It pretty much sums up the fear quotient of clowns—watch what you laugh at around them lest they think you are laughing at them instead of with them.  [Image Right: Pennywise the Clown © Warner Bros. Television.]

6. It's important to play with others. I've written before about the importance of the playground and free play to a child's social development. Having an overbearing, jealous, and abusive mother who will not let you socialize with others can be detrimental to your well-being, as Norman Bates of Psycho can tell you. Bates' mother particularly demonized women, making it impossible for Bates to have a normal relationship with someone of the opposite sex. He ultimately kills his mother, but having had little to no contact with the outside world, and having had her run his life, he is incapable of living by himself. He resurrects her via a dissociative personality disorder so that she can continue to guide his actions. He is so wracked by guilt about even talking to women that he is driven to kill them. Again, this is why it is so important that we all get a chance to spend some time in the sandbox—we need to learn how to navigate social situations and essentially run our own lives.

7. One day, Mother Nature is going to fight back. Fido may be humankind's best friend, but even Fido has a limit for how many times he'll fetch the paper. Movies like The Birds and Jaws suggest that if animals and fish ever decided to launch an orchestrated attack on the human species, we'd be royally screwed. We've evolved bigger brains, sure, but we've lost our "animal instincts." For all of our fancy technology, if all the birds in the world decided they were going to methodically dive-bomb populations, we might find ourselves quickly isolated under a self-imposed quarantine. Sharks are built to be predators. Humans? Without the trappings of modern weaponry? Not so much. Really.

8. We need variation. Jack Torrance, a writer, takes a job as a caretaker for The Overlook Hotel in The Shining. He believes the off season, when the hotel is closed and snowed in, will be the prime opportunity to work on his book. The snow sets in, and Jack has a bout of writer's block, begins to drink, and starts to see things. The hotel's past begins to manifest. He fights with his wife, and is encouraged by the ghost of a former caretaker to "correct" his family. He sets out to murder them.

While Jack had a good idea, and sometimes getting away can help a writer, we need the spontaneity that accompanies human contact. The same routine day in and day out dulls the senses. You know, it's kind of like that job where you sat and stared at your computer screen blankly until 5 p.m.


9. It's important to know your local history. A quotation carved into a pillar at the National Archives in Washington, DC reads, "The past is present."  With each moment, we move inexorably forward. But that doesn't mean that all that has happened has magically disappeared. We leave material traces of ourselves in the world (see A Tale of Progress for the Sake of Progress for further discussion on this point). For example, just recently a gravestone from 1799 was found in Washington Square Park, which was once a potter's field.

According to one story concerning a family from Amityville, the DeFeos might have been better off if they had taken some time to learn about the land their house stood on. Ronald DeFeo Jr. was convicted of murdering his entire family in their beds; he claimed voices commanded that he do so. One explanation for the voices DeFeo heard comes from the allegation that the house was built on an Indian burial ground. The Amityville Horror is supposed to be based on the true story of the DeFeo murders: if the Lutzes, who purchased the DeFeos' home after the family's untimely deaths, had done some more digging, they would have thought twice about the bargain price they were getting for the house. When paranormal events drive the Lutzes from their home, they are forced to abandon everything. See, the past can come back to haunt you. [Image Left: A babbling brook—legend has it that evil spirits will not cross running water.]

FYI: Since the Amityville story is partly based on real events (though there is controversy surrounding the entire affair), there are many places on the web with information about the house and the paranormal activity that was supposed to have occurred there. This site gives an interesting overview if you want to learn more.

10. There are truths hidden in the strangest places. In The Gate, the lyrics to a rock song are actually the words to a spell that opens a portal to another world. The past survives in unusual ways, and sometimes history can be appropriated without fully understanding its context. But this is perhaps the most important "lesson" from this genre that anthropologists can take away: there are interesting histories still waiting to be uncovered—you have to open your eyes and be willing to think outside of the box. We are an inventive species. We recycle ideas (the amazing number of scary movie remakes I uncovered writing this post attests to this). And each time we reframe something we give it new meaning and life, creating another means of viewing our society. Folktales can be particularly enlightening in this regard.

Well, there you have it. I hope you've enjoyed this week's look at Halloween. In the coming weeks, we'll return to daily life. See you soon—I heard a noise I should probably check out.

Do you have a "lesson" you'd like to share? Or know of another "classic" scary movie that would help explain any of my lessons? Share below.


Technorati Tags: Halloween, 10 scary movies, horror, Michael Myers, The Shining, Carrie, It, Nightmare on Elm Street, The Gate, The Birds, Jaws, 10 lessons from scary movies, Pennywise, Freddy Krueger, Christine (movie), Psycho, Norman Bates, Amityville

Friday, October 30, 2009

Did You Spit on That Bat? Rituals and Superstitions in Baseball

Update: The opening day Citi Field video appears to have been removed. I have replaced the video with a link, and included a photo in the event that video also gets pulled by MLB.

The crack of the bat, the smell of sausage and peppers, and the vendors' cries of "Beer here!" and "Pretzels!" will soon be nothing more than distant echoes—the ghosts of a season past. American baseball fans are saying their final farewell to summer with the Fall Classic. Just as spring training marks the coming of long, lazy summer days eating hot dogs and pretzels, yelling at the umpires, and singing "Take Me Out to the Ballgame," October ball means hot chocolate and gloves, jackets, and blankets (though there is still considerable yelling at the umpires and booing the opposing team). At the end of the Classic, once a champion is declared, the non-contenders lick their wounds and turn their thoughts to the next year. The fans begin to dream of spring. Until that point, however, baseball is littered with small rituals and superstitions meant to appease whatever gods are watching, and more importantly, the fans. If you do happen to tune in to watch this year's contenders, pay attention to the guy on the mound. If he's pitching a good game, take note of whether any of his teammates talk to him. Try to see if anyone steps on the foul line. And when the camera pans to the dugout, note whether the guys seem to sit in the same place. In the stands, check out whether the fans have signs and how they're wearing their hats when their team is down.

Baseball may be one of the few remaining bastions where superstitious beliefs and behaviors abound—and are actually encouraged. David Wright and his teammates on the New York Mets shaved their heads to break a slump, believing that a change and a show of solidarity would change the way their bats behaved. Wade Boggs ate chicken before every game. John Smoltz did jumping jacks for almost half an hour once to keep a rally going for the Braves. And then there's the equipment: Players should talk to their bat, sleep with it, and never, ever loan it to another player. And they should also remember to spit in their hand before picking it up. Baseball players know that missing any one of these things could mean that the fate of the game—and the wrath of at least 40,000 screaming fans—could be on their shoulders. And that's only the beginning. There are tons of superstitious baseball beliefs like these.

Whole teams can be cursed, like the White Sox. Though the most famous curse probably relates to one of the biggest rivalries in the sport—the Red Sox and the Yankees. The Curse of the Bambino was invoked when the Red Sox sold famed pitcher and right fielder Babe Ruth to the Yankees in 1919. Up until that point, the Red Sox had been fierce contenders in the sport—they'd won five World Series championships, including the first title in 1903. After selling Ruth, it would be 86 years before they'd win another. In 1920, during his first year with the Yankees, Ruth hit 54 home runs—more than any other team in the American League hit that season. He led the Yankees to seven AL pennants and four World Series championships. From 1920 to 2003, the Yankees would win 39 AL pennants and 26 World Series titles. The Red Sox struggled. The championship always seemed to be just beyond their reach, and a vicious rivalry grew between the two teams. They tried in vain to lift the curse, which was finally broken with the help of a bloody sock—but not before Bill Buckner let a ball through his legs in 1986, helping hand the World Series to the Mets. In 2004, the Yankees and the Red Sox faced off in October. The Yankees won the first three games of the American League Championship Series. Then, in a startling upset, the Sox managed to win the next four games straight, thanks in part to Curt Schilling's willingness to pitch through an injury. [Image Above: Bill Buckner 1986 © Sports Illustrated.]

The Sox got off lucky—the Cubs are still trying to shake the Curse of the Billy Goat. And recent seasons (and really, some not so recent seasons) have Mets fans asking whether the team is cursed too. Is it the Curse of Doc Gooden? One fan has gone so far as to propose the Curse of the Cat, claiming the new Citi Field has been cursed by the cat that ran across the field during the opening game at the new ballpark (see video here). When the new Yankee stadium was being constructed, Red Sox sympathizers tried to put a curse on the Yankees by burying a Red Sox jersey in the foundation. Despite initially laughing at the claim, Yankee officials ordered workers to break through two feet of cement to remove the offensive garment. [Image Right: Cat streaks across field during opening day at Citi Field. Credit: Mary Altaffer/AP.]

What causes otherwise rational people to succumb to superstitions in sporting situations?

Superstitions are found in every culture. We often first encounter them as children. They are the first set of "rules" about our world that we learn—an attempt perhaps to try and exert some control over our lives. For example, these are two that you are probably familiar with:
You must hold your breath while going past a cemetery or you will breathe in the spirit of someone who has recently died.
Don't step on a crack on a sidewalk or walkway. (Step on a crack/break your mother's back.)
As children our lives are mostly managed by our caregivers, so there's a small sense of satisfaction in knowing that because you held your breath as you passed a cemetery, you—as small as you are—were able to thwart an evil spirit or prevent an injury to your mother.

As adults, we're faced with the realization (and frustration) that there are still things beyond our control. We want to know. We need explanations. While modern technology and science have provided many answers, superstitious beliefs still persist because they offer a sense of control—and a connection. While you may not believe breaking a mirror will bring you seven years' bad luck, you're unlikely to go out of your way to smash mirrors intentionally—or walk under a ladder, for that matter. You believe that in not breaking a mirror, you've done nothing to draw bad luck, and so any misfortune that comes your way can't be your fault. And if your neighbor believes he shouldn't break a mirror, then we're back to the herd mentality—the idea that we can be protected if enough people follow suit. A belief in superstitions connects us to others. If we all believe (or kinda believe) that breaking a mirror brings bad luck, then we're all less likely to break mirrors. We are consciously careful with this fragile glass. Over time, the superstition may fade, but not before a pattern of behavior emerges (i.e., don't break mirrors because the shards could cut someone).

With 40,000+ people watching anxiously as batters try to hit a tiny, fast-moving ball, players know that if they don't appear to exert some sort of control over the situation, there'll be a backlash—no matter how good they are. So they tap the plate, draw a line in the batter's box, fix their gloves, etc. They're showing their fans that they're ready—it's not their fault if they miss. Their decisions are under the collective scrutiny of fans who view them as their representatives in the competition. NY Mets fans have not quite forgiven Carlos Beltran for not swinging the bat, which ended the Mets' run in 2006. These superstitious rituals give the players a sense of control over the unknown, but they also protect them. [Image Left: Carlos Delgado, NYM, gets ready to bat.]

And it connects them to the fans. In every national sport, the athletes are an extension of the populace.
An athlete who represented his people and won, conferred a distinction upon the people and the city as his victory testified to the caliber of the citizens; accordingly, upon his return, they would break down a part of the wall for him to enter through because “a city which could produce such citizens had no need of walls to defend it” (James 1963: 154).
We are linked by a belief in them—no matter how poorly they perform, we have chosen them as our representatives. And our allegiance to a team allows us to distinguish ourselves from one another, as Mets fans and Yankee fans and Philly fans, and White Sox fans and Cubs fans can all tell you. It grants us an identity—it confers respectability. The Tomahawk Chop, and singing Sweet Caroline and Lazy Mary, are rituals that unite us. They also confer an expectation that the team will win. After all, the fans are wearing their protective charms—jerseys, hats, and even sneakers—they carry signs, and bring brooms when necessary. All the players have to do is spit on the bat, tap the plate, and remember to swing. [Image Left: Crowd at Shea 2008.]

Seventh inning stretch at Shea Stadium, 2008. Fans sing "Take Me Out to the Ballgame" and "Lazy Mary."
Losses hurt that much more because of charms and superstitions: superstitious practices are meant to shield against loss while passing the blame onto circumstances beyond anyone's control. If a team loses even after the rally caps have been put on, then it isn't for want of trying. To enter a ballpark is to believe that belief is enough. NY Mets fans know this—it's their motto: "Ya gotta believe." The ballpark remains the last fortress of sheer belief, where the energy of the screaming masses can guide a bat true and steady. Maybe there is a little magic left in the world after all. You can only catch glimpses of it from April to October, and for some it is brighter than for others, but it's there—wherever there are fans, and pitchers, batters, fielders, and catchers to perform the rituals that defy logic. When the game is on the line, the power of the crowd encompasses the staunchest critic. We're one force, and it's our respectability on the line. And superstitions don't seem quite so silly. [Image Bottom Right: NY Mets Fans with signs at Shea Stadium, 2008.]

Do you throw salt over your shoulder? Knock on wood? Wear your team colors on your socks when they're playoff contenders? Let us know below!

Cited:
James, C.L.R.
1963 Beyond a Boundary. New York: Pantheon Books.

Wednesday, October 28, 2009

The Spirit of America—Haunted and Hallowed

The traditional image used to describe American society has been that of a melting pot: immigrants to the United States assimilated, leaving behind their traditions and values. For an emerging nation seeking to establish itself, the politics of sameness may have been a unifying force. But while early immigrants may have been unable to import tools to support traditions and cultural practices, the rise of globalism eased this challenge and made it easier for immigrants to remember their histories. Increasingly today, to be American does not imply a standard set of traditions and values—there is room to remember your own roots. We are a richer culture for it. For this reason, the metaphor describing Americans has shifted to favor cultural diversity, proposing an image of a tossed salad or salad bowl: the different ingredients in the salad each retain their individual properties but are united in the salad bowl. My trouble with this image is that it doesn't account for the common element that binds people together in the formation of a nation. If we are a nation of salad ingredients, are we held together by dressing? The bowl?

There are certain traditions and practices that seem to reach across cultures. They are shaped by multiple histories and adopted by most people, making them social markers. Halloween is one such observance. Its history is rooted deep in the past. As an observance, it has been touched and shaped by major political and social forces through the ages. Today it is celebrated around the world in some form, providing a unifying basis for many people. In America, the present form of Halloween suggests that perhaps we should consider describing our nation as a patchwork quilt. We are different elements, even unique elements, sewn together by an underlying belief in freedom. That we can all share in Halloween festivities regardless of our ancestry suggests that flexible festivals can not only survive, but bind people together.

[Image: A local graveyard installed on a neighbor's lawn.]

Halloween traces its roots to the Gaelic festival of Samhain, which was marked on November 1st by the Celts. It was a harvest festival—timed to prepare for the coming cold. It was a moment of change, and during this transitory period the Celts believed the border between this world and the next was thinnest: Spirits could pass easily into this world to walk the earth. On Samhain night, the Celts wore animal skins to confuse the spirits and to avoid their tricks. It was also thought that because the spirit world was so close that this was an optimal night for divination. To thank the earth for the harvest, the Celts offered blood sacrifices, such as livestock, and Druids read the entrails of the animals to foresee what lay ahead—who would die, who would marry, etc. The predictions provided entertainment for the evening.

While Samhain was being observed by the Celts, the Romans were worshiping the goddess Pomona in a similar harvest festival. They offered her fruits, notably apples—the basis for bobbing for apples. When the Romans conquered the lands that the Celts occupied, the two traditions merged. However, the rise of Christianity did not permit the worship of nature spirits. A concerted effort began to erase Samhain, which stood as a symbol of paganism. The conversion of Constantine helped this cause, but missionaries found it difficult to carry out his orders, so they devised a plan to graft Christian beliefs onto pagan practice. In the 8th century, Pope Gregory III turned Samhain into All Saints Day—a day to honor the saints who did not have a special day already. All Saints Day was also known as All Hallows Day, and its eve became the night on which the practices of Samhain were observed. The "evening of Hallows" ultimately became Halloween. [Image: A modern day ghost.]

People continued to dress in animal skins, put out offerings for spirits, and mark Samhain in the name of All Hallows. In the 10th century, November 2 was pronounced All Souls Day. It was a day to honor all of the dead—the church's attempt to completely replace Samhain with a church-sanctioned holiday. The resilience of what we call Halloween suggests it provides us, as human beings, with something we are reluctant to relinquish. Halloween permits us to believe in a spiritual world that we can reach without the intervention of an empowered other, such as a priest. In fact, it was on these grounds that the church ultimately splintered: in 1517, Martin Luther would nail The Ninety-Five Theses to a church door, discrediting the practice of indulgences and bringing about the Protestant Reformation. Luther sought to bring people closer to God, without the help of priests or saints. All Saints Day was cast aside, and Halloween became a symbol of protest.

Halloween in the American colonies was a scant affair. The Puritans who settled here found the holiday too pagan and too Catholic. It was the arrival of the Irish in America that helped Halloween blossom:
  • The Irish originally carved turnips to frighten away "Stingy Jack," who made an ill-fated deal with the devil and was forced to wander the world with a lump of coal in a turnip to light his way; the carved turnips came to be called "jack-o'-lanterns." Turnips were harder to come by in America, but pumpkins were available and easy to carve—and the name stuck.
  • As mentioned, Halloween night was a time when spirits would walk about and play tricks on people. Ireland is littered with fairy mounds, and the Irish believed that fairies would emerge from their mounds on this night. The practice of costuming and the tradition of playing tricks on Halloween night are meant to confuse these spirits.
  • Halloween night is an optimal time for divination. There are several Halloween games, still played today, that allow you to predict your romantic future. Similarly, the practice of telling these predictions gives us the precursor for telling ghost stories on this haunted night.
  • The word witch comes from an English word, wicca, meaning wise one. The embodiment of the witch as evil was so powerful that animals associated with witches also became powerful—cats, which are nocturnal, were thought to be the wandering spirit of the witch. Today, black cats are a popular decoration item.
  • Bonfires were a big part of the Samhain celebrations. But bonfires would attract mosquitoes, which in turn would attract bats, which also became a popular Halloween decoration.
We have woven these elements together to create our own particular brand of Halloween. People add their own touches to the holiday all the time. In this way, Halloween can belong to all of us and still hold its mystique. Its survival despite the church's attempts to sanitize and edit it as a festival and observance is something that many immigrant groups can relate to—whether they do so consciously or not. And in some ways, its survival represents the American ideal: that you can come here, from anywhere, and live free of persecution and be accepted. This is an idea that many Americans hold to be sacred—though not many may know that a supposed spooky tradition embodies this ideal.

What are your favorite Halloween traditions? Share them below.

Tuesday, October 27, 2009

A Tale of Progress for the Sake of Progress

In the History Channel series, Life After People, we get a glimpse at what the future of our world could potentially look like after Homo sapiens are gone: cities will crumble, nature will reclaim space that has long been cleared and developed, and animals will live in the markers of modernity that we hold so dear. Imagine a world where the Statue of Liberty is rubble, where there are nothing but stones to signify where your home once stood, and skyscrapers are overgrown with ivy and other greenery. A haunting and fitting image for this Halloween season, wouldn't you agree? What will our remains tell of us? And more importantly, will there be remains to tell a story at all?

In 1986, a team of archaeologists uncovered a cluster of Native American structures from the 14th century in Narragansett, RI. The 25-acre site is an amazing find—the remains of numerous buildings, pottery, tools, a burial ground, and 20 circular pits used for storing corn have all been found, offering an extensive look into life in a seaside Indian village:
“It’s just totally remarkable. It’s like suddenly being able to see,” said Paul A. Robinson, principal archaeologist for the Rhode Island Historic Heritage and Preservation Commission. “This allows us to walk through a coastal village and begin to see how it was laid out, the way the houses relate to each other, the different kinds of structures.” 
Landscapes can be particularly revealing about daily life, as I have discussed previously. Only one other seaside Indian village site of this caliber is known to exist: Werowocomoco in Virginia, thought to have been the home of Pocahontas. The degree of preservation at the site is rare, and if it is left to researchers, they can learn something about a history that has largely been lost to time and development. According to the article, New England was either plowed or developed as "progress" marched inexorably forward, removing much of the record of life before the colonies.

However, the future of this site remains uncertain. It was purchased by Churchill and Banks Companies, LLC (formerly the Downing Corporation) and earmarked for the development of about 80 private residences. Per state regulations, the company had hired an archaeological team to conduct a survey for artifacts before building began. Clearly, they didn't expect anything like this, and in fact, they've constructed about 26 houses on the eastern part of the site, which was cleared by archaeologists. The problem is that the remaining part of the land is stuck in limbo. The original idea was that if artifacts were found, they would be moved to an appropriate place, and the company could proceed with development; however, given the nature of the finds at the site, the Historic Heritage and Preservation Commission requested that the Coastal Resources Management Council (CRMC) revoke the permit it had issued to Churchill and Banks. CRMC said the permit would need to be re-evaluated, and when Churchill and Banks tried to build on the site in June 2007, they were given a cease-and-desist order, prompting a $10 million lawsuit on their part. They claim that the rules are being changed midway, and they don't view the state's offer to purchase the site for approximately $2 million as fair compensation for the time they have invested and lost in this project.

Archaeologists fight these types of battles often. Fortunately, I have an inside line into the world of archaeology—my good friend Meg Kassabaum, a Southeastern US archaeologist, faced a similar situation at the Feltus site in Natchez, Mississippi. Feltus was a threatened site with the potential to provide important insights about Indian life in a period for which there is limited evidence. The site has three large mounds on it, one of which is a burial mound, so it is possible that Feltus was a "sacred" location, much in the way the Narragansett regard the site in Rhode Island. In an email to me, Meg wrote,
The modern fiasco of preserving the site is both more simple and more complicated than the case described in [the other] article. Perhaps most importantly, however, it is more resolved!  
As luck would have it, Feltus is located on the land of people who are interested in preserving the past and have no intention of selling their land anytime soon, which means that the site will be available for research for years to come. However, while good landowners are integral to the success of archaeology, sometimes collegial relationships aren't enough. Meg reports that in the United States, it is possible for the owner of a piece of land to sell the rights to the underground minerals but retain possession and control of the surface. It turns out that many years ago, the Feltus landowners had sold off the mineral rights to the site. One day Meg and her team showed up at the site to find an oil company surveying the land for a place to put its newest oil well, and were told there was nothing they (or the landowners) could do to prevent the construction.

As with the Narragansett site, cultural resource organizations mobilized quickly: the historic preservation community in the nearby town of Natchez and the state capital of Jackson sprang into action—and through the hard work of many people, with "impassioned pleas and logical arguments," the oil and gas permitting board was convinced to deny the company its permit for that location, saving the mounds and other archaeological materials at Feltus for further excavation. In Meg's words,
The oil well ended up being built just down the road and we witnessed the destruction of an additional, much smaller archaeological site during its construction.  Perhaps the scariest part of this all is how quickly this all could have happened and how much of the site could have been destroyed if we had not HAPPENED to be working at the site at the time the oil company first showed up.
Rhode Island requires that sites be surveyed for archaeological artifacts prior to development, but not all states have these types of regulations—and even in places that do, these laws are often bypassed on the grounds of "budgetary concerns." The Feltus story shows us how easily historical artifacts may be lost, while the Narragansett site illustrates why development companies would want to bypass preservation laws. Churchill and Banks believe that they have followed the rules and laws as required, and yet there is no agreeable resolution in sight that takes their losses into account.

Without advocacy—and sometimes even with a voice—historical sites can be lost to progress. Even with the economic downturn, New York City has been "in development." Old construction projects have been finished, and new ones started (though there are admittedly fewer than there once were). The redevelopment of Pearl Street in the Financial District of New York City provides an interesting parallel. Pearl Street dates back to the New Amsterdam colony. It's seen the rise of trade and commerce in the New World, and so when redevelopment began on the site, there was a bit of uproar from interested parties. The Rockrose Development Corp bought the properties on the block with the intent of putting up hotels, parking garages, and other such markers of modernity. And they did. The block has been completely redeveloped, with the exception of 211 Pearl Street, whose facade is still standing—partly because it was originally home to Colgate (yes, that Colgate) and partly because a strange brick triangle was found in the walls. The facade now masks a parking garage, and with future development planned for this site, it will likely look a bit out of place—but that may be better than having lost it entirely, no?

I find it interesting that the voices of protest are most often the researchers. Of course, most historic sites don't have their original owners to speak for them, but in Narragansett, this isn't quite the case. The article doesn't give the Narragansett tribe a voice on the issue beyond a mention that they side with the state, which is odd since, according to the piece, they have long regarded the site as sacred. The Narragansett culture has been around for centuries, and they are likely the first people to have lived in the Rhode Island region. It seems that if anyone should be vocal about the findings, and have a real say in the future of the site, it would be these people.

We are constantly building bigger buildings, rebuilding sites, repurposing our land. And we have to—space is at a premium! But do we really need another hotel? Or another housing development where the houses are virtually indistinguishable from one another? At what point do we draw the line between progress and progress for the sake of progress? And who should be the custodian of historical sites—the state, or a research body?

The time will certainly come when we are no longer here. Will there be a record of our past, or only of our progress?


Source and acknowledgments:
Thanks to Meg Kassabaum for chiming in on archaeological matters.
Special thanks to Barry Bainton who forwarded the link to this article:
Edgar, Randal. 
2009     "Dig Uncovers Significant Historical Site in Narragansett."  The Providence Journal. October 19.

Monday, October 26, 2009

Parables in Peculiar Places

Jerry was driving home late one night when he saw a young lady waiting by a bus stop. He stopped his car and told her that he didn't think the buses were running so late at night and offered her a ride. The fall night air was getting chilly, so he took off his jacket and gave it to her. Although his passenger wasn't much for conversation, he managed to learn that the girl's name was Mary and she was on her way home.

After driving for an hour, they arrived at her home. Jerry said goodnight, she went in the front door, and he went home himself.

The next day he remembered that Mary still had his jacket.

He drove to her house and knocked on the door; an old woman answered. Jerry told her about the ride he had given her daughter Mary, and explained he had come back to get the jacket he had lent her. The old woman looked very confused.

Jerry noticed a picture of Mary on the fireplace mantel. He pointed to it and told the old woman that that was the girl he had given a ride to. With her voice shaking, the old woman told Jerry that her daughter had been dead for many years and was buried in a cemetery about an hour away.

Jerry ran to his car and drove to the cemetery. He found his jacket, neatly folded on top of a grave. The name on the gravestone was ... Mary. (Late Night Ride)

[Note: I made minor edits to this story for readability. I did not alter content or tone.]
It's the spookiest time of the year! So I thought we could have some fun this week with some Halloween-themed posts. The story above is one you have no doubt heard or read at some point—perhaps around a campfire or at a Halloween party. The ghostly hitchhiker is a popular American folktale. While many variations exist, the stories tend to be rather formulaic. A motorist (usually a man) picks up a lonely hitchhiker (generally a young woman) and transports her to her destination, where she either vanishes or enters a house. If she enters a house, the driver usually has reason to follow her—e.g., he wants the jacket he loaned her, wants to return something she left in the car, or, if she vanishes when the car arrives at the destination, wants an explanation as to why she disappeared. When he knocks on the door, he learns from a grieving loved one that the passenger died, often many years ago. If he's trying to reclaim an item, it can usually be found at her grave site. Two famous paranormal American hitchhikers are Resurrection Mary and the Greensboro Hitchhiker—both feature young women in white looking for a ride home after a night at a dance, and both women ask to be dropped off at cemeteries. (Cue spooky music.)

The age of these stories is unknown, but they have existed in the United States since the days when we traveled by wagon, and possibly even earlier in other places. For example, in the Bible, the Apostle Philip hitches a ride with an Ethiopian, whom he baptizes before vanishing. There are roads throughout the world purported to host a ghostly traveler looking for a ride back to loved ones or to a final resting place. Given that this particular story is found in different variations throughout the world, perhaps there is more to these phantom hitchhikers than a simple scare. Could the stories be parables—perhaps ones that encourage altruism?

In the story above, a man stops for a young woman waiting at a bus stop and, thinking of the late hour and the difficulty she may have in getting home, offers her a ride. There is an innocence here that is far removed from the mentality of our current society. I don't know many women today who would willingly rely on the assistance of a strange man, even if it meant waiting an hour for the bus. One of the basic rules for survival that we learn as children is that you never, ever, ever get into a car with a stranger. And yet, time after time, these women do just that, and the men seem to have no ill intentions. They simply want to ensure that the young woman gets home safely. There is a basic concern for a fellow human being exhibited here—perhaps linked to more distant times when relying on your neighbor was common and necessary for survival. The driver comes to no harm by helping the phantom. In fact, he has helped her spirit rest.

The story is also plausible because hitchhiking was a fairly common practice until recently—in the 1950s and 1960s, and even into the 70s, it was a common means of making your way around the country. The simple formula of this story also means that it's highly customizable. Jerry can easily become "my friend Harry" or "my dad, when he was in college coming home for the weekend," etc. Similarly, the address can be in any locality, the cemetery can be any old graveyard, and the hitchhiker can belong to any couple known to have lost a daughter to a car accident. During this period, there was also a rise in drag racing, resulting in an increase in teen deaths overall. Songs like "Last Kiss" were based on these types of events. You may have heard the version cut by Pearl Jam, which was a hit for the group. Here is the original:


There seem to be two lessons to be taken from these types of ghost stories, then. First, there is a reminder of the importance of community—that you could depend on a stranger for help if you were in need. Second, there is also a reminder to the audience about the dangers of driving too fast: you too could be doomed like Mary. (Perhaps this was also an attempt to empower girls to take a stand against dangerous boyfriends who drove too fast, particularly under the influence of a few beers.) The drivers who pick up these passengers are usually male. The phantom passengers are a shadowy fate for the drivers if they aren't careful. After all, they could survive, but their passengers—their girlfriends and other loved ones—may not. These stories aren't necessarily "spine tingling," but they impart messages about society to each generation of listeners. They try to impose patterns of behavior.

In accordance with changing times, a darker variation began to emerge in the late 1970s, becoming popular and widespread in the 1980s:
One summer day, a woman pulled into a gas station. As the attendant pumped the gas, the woman told the attendant that she was in a hurry to pick up her daughter who would soon be finishing her art class.

While she was waiting, a man walked over to her car. He explained to her that his rental car had died and he needed a ride to an appointment. The location was just down the road from her daughter's art class, so she told him she would be happy to give him a ride.

He put his briefcase in the backseat, and said that he was going to the men's room quickly. A few minutes passed, and the woman looked at her watch. Realizing that she would be late, she drove off quickly, forgetting that the man would be coming back.

She thought nothing of it until she and her daughter pulled into their driveway. She saw his briefcase and realized she had forgotten him! She opened the briefcase, looking for some sort of identification. All she found inside was a knife and a roll of duct tape.
This story is full of warnings and fits with the messages we learn as children about the dangers of trusting strangers. This version has grown more complicated and detailed with time. It amplifies the figure of the stranger, sending a clear warning about the dangers of trusting too easily. It demonstrates a shift in societal thinking, which in turn reflects larger events of the time. The early 1980s was a period of tremendous change: John Lennon was assassinated, Reagan was elected, and the US invaded Grenada in the battle against communism. The world was changing rapidly—and seemingly not for the better. Suddenly, you needed to be wary of your neighbors, and you needed to watch out for yourself because, as the media would later proclaim loudly, self-interest was on the rise and it could be deadly.

These stories can provide insights into the social organization of their time. In addition to providing a slight chill, they are also apt teaching tools—people of all ages are more likely to pay attention if a story seems familiar and if it's somewhat tragic or frightening. Many cultures have scary stories reserved just for frightening children to ensure they behave. Does the boogeyman ring a bell? There is a lot that can be learned from folklore.

Do you have a spooky story to share? How would you alter these tales for a retelling today: Will the hitchhiker wear earbuds? Will the driver be on his way to meet a phantom he talked to online? Is the murderer in the backseat a cyberstalker? Share your ideas below!

Technorati Tags: Halloween, ghostly hitchhiker, paranormal, ghost stories

Thursday, October 22, 2009

Paradigms of Sociality in a Digital World

In today’s digital world, we increasingly rely on social media to manage and maintain social connectivity. From Facebook to Twitter, we’re all connected and used to sharing—or over-sharing—details of our daily lives. Having dinner? Angry with someone or about your commute? At the gym? Witness something strange on the street? Having a miscarriage? People are using technology to share everything—and sometimes there are consequences to posting. People are also spending more time in digital worlds, such as Second Life, and more than one relationship has ended because a partner has been unable to disconnect or is guilty of a virtual affair.

We are more connected than ever, but does this extend beyond the digital arena? It seems that while we are willing to share the most intimate details of our lives online, we want to minimize our contact with others in the physical world. For example, text messaging is one means of avoiding conversations with others. (Know anyone who’s been dumped by a text message?) Are we moving toward a paradigm of public social avoidance with the rise of social media?

We actually take physical steps to reduce contact with each other. On my morning commute, the majority of riders have reduced opportunities for social contact through the use of iPods. I have blogged previously about how these devices can intrude upon one’s personal space in the public arena, but they serve a purpose: with earbuds firmly in place, the iPod user can “tune” out distractions—and each other: 
The immense storage capacity of iPod and its imitators offers at least the opportunity for total, uninterrupted isolation from one's surroundings for long -- extremely long -- periods of time. It is now possible to commute, to stroll, to shop, even to go to a Knicks game, without having to listen to another human being, or even the same song. There is no rewinding or CD-changing to permit the outside world to leak inside the cocoon. With a jukebox in your pocket, a suitable tune is always at the ready, no matter your mood. And if you have little white ear buds rammed in your ears, there is always an excuse not to acknowledge fellow humans. ''I'm busy right now,'' iPod users seem to say. ''I'll get back to you in 10,000 songs.'' 

I admit to being a "reading commuter"—a good book occupies the time it takes for me to get to and from work. Or so I tell myself. In truth, being engrossed in a good story means that I have a polite out in the event any of my fellow commuters decide to strike up a conversation—I can keep it short and polite and pointedly go back to my book. And it also means that I can ignore some of the more "colorful" characters who may cross my commuting path. But why do I engage in this means of social avoidance? After all, I am a prolific Facebook user and I have posted my share of "I'm making dinner" statuses—and given the size of my network, I've likely shared this information with people who couldn't care less. But in public, like my fellow commuters, I'm not particularly open to sharing even non-personal details.

Social avoidance is not a particularly new phenomenon. It has most famously been linked to the Kitty Genovese murder that took place in the 1960s in Kew Gardens, NY. Kitty's screams for help went unacknowledged—none of her neighbors phoned the police despite potential evidence that they heard her cries. Social psychologists have studied this case extensively and attribute the inaction of Kitty's neighbors to two possible causes: pluralistic ignorance and diffusion of responsibility. The former refers to a group privately rejecting a norm but remaining inactive because its members believe that the majority supports the norm. The latter refers to inaction due to the belief that someone else will act—responsibility is not explicitly assigned to any single individual. Both are linked to social avoidance.

The link between iPod users, reading commuters, and Kitty's neighbors lies in the extent to which our social ability to "see" is hampered by these tools and strategies. Public safety officers have long encouraged runners and others to be aware of their surroundings for their own personal safety. The idea is that while you may be grooving out to Queen, Blink-182, or Fall Out Boy during your morning run, you could miss the sound of footsteps coming up behind you, or if you're holding your breath as you travel with Stephen King through the macabre, you may find that your wallet or Metrocard has abandoned you at some point during the journey. But beyond personal safety, are we losing our ability to see others? Remember our discussion about the invisible homeless? If we lose our ability to see each other, how does that affect our ability to read social biofeedback? Does this compromise our theory of mind—our ability to mentalize, to attribute beliefs and intentions to others? Theory of mind is integral to social life—it allows us to navigate and manage our social relationships because it allows us to read social biofeedback and understand others. But social interaction is important to the development of theory of mind. If we willingly limit opportunities for social interaction, what are the implications? What are your thoughts on why we would seek to limit physical interactions in the first place?

Follow up: The NYT City Room blog posted reader responses to the issue of cell phone abusers.


Technorati Tags: iPod, commuting, social avoidance, Kitty Genovese, sociality, Second Life, World of Warcraft, invisibility, theory of mind, text messages, Facebook, Twitter

Tuesday, October 20, 2009

Much Ado About the (Swine) Flu?

Have you gotten your seasonal flu shot?

Unless you've been living in some extremely remote region, completely cut off from the world (and then, let's face it, you wouldn't be reading my post), you've likely heard of the dreaded H1N1 (swine) flu virus that's been making the rounds since the last flu season. Well, Stephen King predicted this would happen, so we should have been prepared! Seriously, go read the original version of The Stand, and tell me that we aren't facing similar circumstances—well, okay, humankind hasn't been virtually eliminated except for two polarized factions aligned with opposing divine forces (at least not yet), but the part about a mutating flu virus and the resulting panic seems appropriate in this context. But all joking aside, people are definitely frightened, and it's understandable to a certain degree.

Why are they frightened, though? As a species, we've faced this before—something like it, anyway—and (most of us) survived to tell the tale. It's a shame students don't learn more about past pandemics in history class—because pandemics have happened before. From 1918 to 1920, a version of the H1N1 virus called the Spanish Flu swept the globe, killing anywhere from 50 to 100 million people. Approximately one third of the world's population was infected. In the United States, 28% of Americans were infected, and approximately 675,000 people died. And we've faced—and spread—other viral outbreaks. Native American populations, as well as other Amerindian groups, were decimated by smallpox and other communicable diseases. And before contact with the New World, there was the Black Death of the 14th century, which caused approximately 60% of Europe's population to die painfully and grotesquely, with subsequent outbreaks throughout the globe well into the 20th century. These outbreaks have been traced well into our history. For example, according to Barquet and Domingo (1997),
Smallpox is believed to have appeared at the time of the first agricultural settlements in northeastern Africa, around 10 000 BC. It probably spread from Africa to India by means of Egyptian merchants in the last millennium BC. The earliest evidence of skin lesions resembling those of smallpox is found on the faces of mummies from the time of the 18th and 20th Egyptian Dynasties (1570 to 1085 BC) and in the well-preserved mummy of Ramses V, who died as a young man in 1157 BC.
The bottom line is that viruses have been around a long time. An outbreak can be devastating, there's no doubt. And it's frightening, but we've learned strategies for coping with them that we shouldn't lose sight of.

It sometimes feels as though in this age of information, we have access to too much information. Shocking to think, isn't it? But there is so much opposing information available to a public that may not read science articles and blogs too closely, that they may have trouble deciding what information they can believe and what sources they can trust. All it takes is the click of a button to create widespread panic. Let's say a concerned mom wanted to read a bit about vaccinations before she takes Timmy in for his flu shot. She does a Google search for vaccinations and finds a link for The Vaccination Debate. Immediately, she posts information from this site on her Facebook page and she tweets about it to her network, and when she goes to pick up Timmy from school, she talks to other moms and dads picking up their kids. At home, while doing more research (because now she's worried about all the other vaccines Timmy has had) she learns about a leaked memo with information about the H1N1 vaccine. She doesn't check up on the source, and so she joins the parents who won't vaccinate their children against H1N1 due to concerns about possible side effects from the flu vaccine.

The H1N1 outbreak and the push to vaccinate specific groups have sparked a furious debate—particularly for health care workers, for whom the vaccine was declared mandatory in New York. A New York State judge suspended the regulation in the face of a lawsuit from three nurses who did not want to be subject to what they believe is an experimental vaccine. Their lawyer responded to the ruling with the following statement:
These three women are not saying, "We don’t want to be vaccinated" ... They’re saying: "We don’t need this vaccination. We don’t think, for any number of reasons, it’s effective or necessary. It might be harmful to us. It hasn’t been adequately tested."
The resulting comments from NYT readers have been interesting to read. One reader posted,
This is a victory for personal liberty, but a public health disaster.

While still another stated:
We require that students receive vaccinations before attending public schools, because otherwise we put their fellow students and instructors at risk of an outbreak. Health professionals work in close settings with individuals who may have the flu or may have compromised immune systems. It is the right thing to do for them to receive the vaccination.
And a third added,
In 15 years when you are curious as to why all three of your kids are on Ritalin, and you are having anxiety attacks, maybe you’ll have time to ponder what testing was/wasn’t done on your vaccine.
I can see both sides of the story—I wouldn't want to be injected with an experimental drug, but at the same time, if I'm hospitalized or in an at-risk group, I want to know that all the necessary precautions have been taken to keep me safe. I know that vaccinations against TB have saved millions of people, and Salk's polio vaccine changed the lives of countless others. A troubling aspect of influenza is that it mutates. From 1918 to about 1957, the H1N1 virus circulated in the fall and spring. In 1957, the virus shifted to a new strain, H2N2, and because people had not been exposed to H2N2 previously, a rough flu season ensued. In 1977, H1N1 reemerged with an altered genetic makeup and more people got sick. Are you seeing the pattern here? Of course, I'm not saying that you shouldn't worry—scientists really don't fully understand how immunity to influenza works—but we have been through this before. The virus emerged, and because it was new to our immune systems, we had no resistance to it, so people got sick and some died from influenza-related complications. But over time, we developed resistance, and we survived (relatively better than King's population in The Stand)—until the virus shifted and the cycle started again. All strains of influenza can prove deadly, particularly to someone with a compromised respiratory system, but it's often not the flu itself that kills—influenza is an underlying cause of death, opening the door to fatal complications.
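This shift-and-reemerge cycle can be illustrated with a toy SIR (Susceptible–Infected–Recovered) model—the same family of models used in the zombie-apocalypse planning papers mentioned earlier. The parameter values below are my own illustrative assumptions, not estimates for any real flu strain; the sketch simply shows why a strain the population has never seen (no pre-existing immunity) infects far more people than one to which half the population already has some resistance.

```python
# Toy SIR sketch (illustrative parameters, not real flu data).
# A "novel" strain meets a fully susceptible population; a "familiar"
# strain meets one where half the population retains immunity.

def attack_rate(immune_fraction, beta=0.4, gamma=0.2, days=365):
    """Fraction of the population ever infected, via simple daily Euler steps."""
    i = 0.001                          # small initial seed of infections
    r = immune_fraction                # already immune from past exposure
    s = 1.0 - immune_fraction - i      # everyone else is susceptible
    for _ in range(days):
        new_infections = beta * s * i  # contacts between S and I
        recoveries = gamma * i         # ~5-day infectious period (1/gamma)
        s -= new_infections
        i += new_infections - recoveries
        r += recoveries
    return r - immune_fraction         # infections beyond the initial immune pool

novel = attack_rate(immune_fraction=0.0)     # nobody has seen this strain
familiar = attack_rate(immune_fraction=0.5)  # half the population has immunity
print(f"novel strain: {novel:.0%} infected; familiar strain: {familiar:.0%}")
```

With these assumed rates, the novel strain sweeps through a large share of the population, while the familiar strain sputters out—residual immunity keeps each infection from replacing itself. It's a cartoon, but it captures the pattern the 1918, 1957, and 1977 seasons traced out.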

There is a lot of information about influenza and outbreaks available at our fingertips, but people have to go looking for it beyond a basic Google search, and given the rising panic about the availability of H1N1 vaccines, this doesn't appear to be happening. Our fear of the flu is creating disruptions in daily life—for example, sharing food at the office has been discouraged, shaking hands at church during the sign of peace has also been discouraged, and family members are finding themselves unable to visit relatives in the hospital. In The Stand, people start to regard each other suspiciously, and a process of shutting out others begins, so that people ultimately die alone. We've adapted to this type of environment before and we know what measures work to protect us: vaccines, hand washing, covering your mouth when you sneeze or cough, seeing a doctor when you're ill, etc. Of course, by the late 20th century it was clear that even the Black Death bacterium could become drug resistant and strike with renewed vigor. Yes, it's frightening—frightening that in today's age of amazing scientific breakthroughs, something as simple as the flu can aggravate other conditions and lead to death. But we live in an age where we can mobilize quickly and minimize tragedy. We created a vaccine for this strain of H1N1 in record time—which of course raised concerns about its effectiveness. But does it also reflect a disbelief that we could mobilize so quickly?

The evidence in favor of vaccines outweighs the risks, but that doesn't mean one shouldn't be cautious, of course. Vaccinations do have minor side effects that almost everyone feels, such as body aches, a low-grade fever (under 101 degrees F), and soreness at the injection site. I know of at least one person for whom these symptoms masked a respiratory infection that landed him in the hospital for three days. But your health care provider should be able to accurately advise which vaccines you need—and if you're looking for more information about H1N1, the CDC has a great site with lots of information for the public (and you may want to talk to your PCP). Another good site for information is Flu.gov.

Share your thoughts on vaccines, the H1N1 virus, the flu, or my questions in the comments box below.

All virus photos courtesy of the CDC.

Technorati Tags: H1N1, pandemic, influenza, virus