How “The Cable Guy” Predicted Our Digital Society

If you ever want a quaint look at the not-so-distant past, spend some time watching or reading old warnings about our culture’s obsession with TV. A few years ago, I did a short write-up on a 1994 Washington Post feature about a family’s addiction to television. It profiles all four members of the family and their dutifully followed TV schedules, interspersed with school programs meant to get kids to watch less TV:

“It is now 9:10 a.m. in the Delmar house. Fifty minutes have gone by since the alarm. Four TVs have been turned on. It will be another 16 hours before all the TVs are off and the house is once again quiet.

By the sink, Bonnie continues to watch “Regis & Kathie Lee.”

At the table, Ashley and Steven watch Speedy Gonzales in “Here Today, Gone Tamale.”

Looking at them, it’s hard to imagine three happier people.”

The reason I see the treatment of such a lifestyle for its novelty (or even its dangers) as “quaint” is that these warnings focused on the inactivity and the social alienation a TV addiction can cause. While not entirely wrong, warnings like these for parents and kids now seem like people using sandbags to fight a coming tsunami. The Internet has created an overwhelming cavalcade of content that also breeds inactivity and social alienation, but it is beyond accepted. In fact, TV itself has largely undergone a re-creation as the most challenging art form of the past decade, with some deeming it “the new novel”.

It’s with this in mind that I found the 1996 Jim Carrey and Matthew Broderick movie The Cable Guy all the more absurd. Its central plot follows a deranged and lonely cable worker (Carrey) going full stalker on an unsuspecting customer (Broderick). Through deception and an undying devotion to making friends, the titular guy ruins the life of his target.

Near the beginning, when Carrey invites Broderick to see a massive TV satellite dish, he somewhat surprisingly predicts the future of our content addiction (the scene as usually clipped is from the end of the film, but Carrey simply repeats the speech he gives earlier).

It’s funny, of course, to find an oddly prescient narrative in what by all accounts is a simple ’90s comedy, not so out of league with other Jim Carrey features like Dumb and Dumber or the Ace Ventura films.

The central point of the film, however, is that this nameless cable guy–who gives only TV characters’ names in lieu of his own–is yearning for companionship because he received none as a child. In flashbacks, we see him left by his alcoholic mother in front of the boob tube (I would provide a clip if I could find one; the film is available on Netflix Instant). So instead of being raised by a caring matriarch, he’s raised by I Love Lucy and Bonanza.

Not exactly a unique narrative; the idea of TV as babysitter has been around since the latchkey days of the 1980s. However, TV has transformed itself from something to distract your kids with into something to actively do with your kids. It has functionally replaced going to the movies as a family event. Of course, what even The Cable Guy could never predict is the booming market for apps that pacify children. Whatever the warnings of the 1990s looked like, they simply could not have anticipated tablet holsters for baby seats and the ever-growing market of children’s games on the App Store.

Much less could they have known this addiction would spread to adults. To paraphrase Allen Ginsberg, I saw the best minds of my generation destroyed by Candy Crush, starving hysterical Flappy Bird, dragging themselves through the Kim Kardashian Hollywood tips and tricks at dawn.

I say all of this not as some puritanical elitist; I myself find my smartphone as addicting as anyone finds theirs. Nor do I even see this as an inherently negative thing: the most popular apps on any major app store are messaging services, meaning we’re connecting now more than ever.

But what does it say that with each method of disconnection presented to us we become more and more comfortable? The future, if Mark Zuckerberg is to be believed, lies with us strapping on VR headsets and immersing ourselves even deeper into realities either completely fantastical (TV, movies, video games) or so distant from our own lives as to be de facto fictional (sports, award shows, news events).

(Note: That last parenthetical got me thinking. Could you imagine being able to strap on a VR headset and be immersed in the streets of Ferguson or the hills of Mosul?)

With every technological development, we find that what “convenience” actually means is being more and more removed from the physical spaces we inhabit and the people we inhabit them with. In his speech, Carrey’s character lists off shopping and Mortal Kombat as two things we in the future can do from the privacy of our own home. But at the time–1996, when the Internet was just about to burst from its nerddom trappings and into mainstream ubiquity–these were things we had to do IRL. There were no other options. Sitting around and playing an SNES with friends was a hallmark of a 90s childhood. Now millions of kids can play Battlefield or Titanfall with each other whilst remaining completely alone. Malls, formerly the center of social life for American adolescents, are dying at record rates as kids find it more comfortable to buy clothes on Amazon and talk to friends on Snapchat, Facebook, or whatever else may come.

Why, after hundreds of thousands of years of cognitive expansion by our ancestors, accomplished largely through socializing, is our priority now dealing with as few people as possible? It is a common trope of sci-fi movies that aliens invade Earth and find we are killing our planet, that we are the disease to be cured. It almost appears as if we have presently come to that realization ourselves.

Of course, as I hinted at before, most of the technology at hand is actually meant to remove the static of socializing. The text of a conversation on WhatsApp or Kik is direct and clear, free of the trappings of a face-to-face conversation (though also free of the cues body language can bring). We have found our physical existences to be tiresome and want only the ego to come across. While the inventions of writing and telecommunications made us figuratively closer as a species and culture, they are simultaneously, and literally, driving us apart.

At the end of The Cable Guy, Carrey’s madman dangles himself over the same satellite dish we see earlier in the film. As police choppers surround him, he confronts the cause of his inner need to socialize.

Having realized his intense loneliness is driven by his longing for his mother’s attention, he plunges himself towards the satellite dish.

Broderick grabs hold of him mid-fall, as Carrey gives the ludicrously well-delivered line “somebody has to kill the babysitter.” Broderick drops him, and Carrey’s body cuts the feed just as the nameless town learns the verdict of a widely watched televised trial (presumably meant to echo the then-relevant OJ Simpson trial).

The “babysitter”, as Carrey calls it, has expanded in ways no one could have imagined. Even the remotest among us are cradled in the embrace of escapism and digital love. Ben Smith, editor-in-chief at BuzzFeed (itself a warlord of digital consumption), once said, “Technology is no longer a section in the newspaper. It’s the entire culture.”

And we don’t really seem to mind. In fact, we all seem quite delighted with the prospect of driverless cars, VR adventures, and same-day drone delivery. But much like unattended children, we find ourselves fighting stress and alienation with the very things that often cause both, forcing the question: From what are we, all of us in the 21st century maelstrom of voided humanity, hiding? What is all this for?


When The War Comes Home

One of the glorious benefits we as Americans get to enjoy is endless war with little direct societal impact. Well, little impact if you discount dead veterans, traumatized veterans, neglected veterans, and a staggering national debt. But compared to most other countries–including the ones we bomb–we as a citizenry can largely avoid whatever atrocities our foreign policy may produce, justified or no.

But on the streets of Ferguson, Missouri, for the past four days we’ve been witness to exactly how that war can reach us in a very real way. People, reporters, and politicians have been shocked, shocked I tell you, to see tanks, armored vehicles, and combat-ready police forces marching on largely peaceful protesters calling for answers in the shooting of unarmed 18-year-old Michael Brown.

Of course, as is usually the case with the mainstream press, it takes one of their own being directly involved for attention to be paid.

And just where did all this artillery come from? After more than a decade of war, the Pentagon and weapons manufacturers found themselves inundated with a surplus of unused, military-grade combat weaponry, and found a perfect market in the overflowing police budgets of suburban America.

Credit: NYTimes.com

The above map, from a New York Times report back in June, shows the overflow of military gear distributed throughout the US:

“The equipment has been added to the armories of police departments that already look and act like military units. Police SWAT teams are now deployed tens of thousands of times each year, increasingly for routine jobs. Masked, heavily armed police officers in Louisiana raided a nightclub in 2006 as part of a liquor inspection. In Florida in 2010, officers in SWAT gear and with guns drawn carried out raids on barbershops that mostly led only to charges of ‘barbering without a license.’”

The question in Ferguson, then, becomes not just why the St. Louis County PD–whose jurisdiction does not include the city of St. Louis–is reacting this way, but also what these police were preparing for. Experts and scholars have pointed out that even the looting late Sunday night could’ve been prevented and halted with basic policing tactics. Instead, the police have escalated the situation by breaking out armored vehicles, donning riot gear, and firing tear gas and rubber bullets at everyone–including the press.

What we now have is a police culture driven by paranoia and a worst-case-scenario preparedness the likes of which has never been seen. Crime across the country is down and, despite mass media coverage, even so-called “active shooter situations” are increasingly rare. The above map shows Oklahoma, for some god-forsaken reason, stocking up on military gear more than any other state. Meanwhile, crime-ridden Detroit can’t respond to a 911 call within an hour. Your local, small-town police force is ready for a war that isn’t coming.

While an argument of “better safe than sorry” could surely be made–and surely has been made–for the militarization of police forces (see the literal occupation of Watertown, MA, after the Boston Marathon bombing), what it provides police forces with is an increased ability to overreact. When you strap flak jackets and grenade launchers on someone, they’re going to behave as if they’re in a war zone, because they certainly must feel they are. If the most powerful weapon you give a security guard is a plastic badge and a can of mace, he’ll only pull the mace out in extreme circumstances. When the most powerful weapon you give a cop is an Abrams tank, suddenly the tear gas and stationed snipers don’t seem so extreme.

This is how the imagery of the wars we’ve waged has been brought onto our streets and into our neighborhoods. A flood of post-war profiteering met the general paranoia of post-9/11 America, and their despondent lovechild is the militarized police force.

Why You Care About Oberyn Martell And Not Lorna Wing

(Note: Spoilers Ahead for Game of Thrones fans)

Don’t feel entirely guilty for not knowing who Lorna Wing was. Wing, who was 85, was a founder of the National Autistic Society and the inventor of the term “Asperger’s Syndrome” (not, as many mistakenly believe, Hans Asperger). Having an autistic daughter herself, she devoted her life to understanding the disorder, popularizing the notion that autism exists as a spectrum rather than as a finite point. Her 1981 paper “Asperger’s Syndrome: A Clinical Account” remains one of the most thorough analyses of ASD, laying what would become the diagnostic groundwork for autism. While her accomplishments are of innumerable benefit to society, especially families dealing with autism, Wing was obscure as a public figure, netting no magazine covers or Top Trending status.

She passed away on Friday, June 6. Five days prior, millions of American households watched Oberyn Martell have his life struck from him in dramatic fashion. Martell, a prince of Dorne, fell in a duel with Ser Gregor Clegane. Better known as “The Red Viper”, Martell was simultaneously championing the supposedly-regicidal Tyrion Lannister and seeking vengeance for the rape and murder of his elder sister, Elia, when Clegane confessed to the crime while crushing Oberyn’s skull.

The difference between the Prince of Dorne and the founder of the National Autistic Society is pretty clear; one is a very fictional person from a very fictional world, and one is a hero of medicine and psychology. One was created by the deliciously twisted mind of George R. R. Martin, and the other was born, had children, then died.

Yet asking why millions mourned Martell and not Wing has an obvious answer: they didn’t know Wing. Despite having the advantage of actually existing, Lorna Wing never had a chance of being nearly as famous as Martell. We watched (or read of) Martell’s whoring, his dalliances, his eight bastard daughters. We watched him fight for a character, Tyrion, whom we’ve known far longer and cared for even more, and in defense of his sister’s honor. He died protecting two innocents in a literal trial by combat, and watching him go in such brutal and shocking fashion took an emotional toll.

Game of Thrones and the A Song of Ice and Fire novels are the perfect venue for learning how we connect with and mourn for fictional characters. Martin, the author of the fantasy series, kills characters with such flippancy and frequency that the mantra “All Men Must Die” (or “valar morghulis” if you’re privy to your High Valyrian) might as well be the official slogan. You can mourn every character death (including the direwolves) at digital graveyards. The coffers of YouTube are full of “reaction videos” wherein we see people crying real tears for fake people.

And Game of Thrones is different from similarly passionate shows like Breaking Bad or The Sopranos. Walter White existed in our reality. He was a guy who drove an Aztek with a missing hubcap and worried about his medical debt. Daenerys Targaryen doesn’t exist in anything like our reality. Arya Stark doesn’t know what a middle school is, but there we are, rooting for this 12-year-old to murder.

So where do our feelings toward fictional characters blend with our feelings toward real humans? We say we mourned Oberyn Martell more than Lorna Wing because we knew Martell more. How do we “know” a fictional character?

Sociologists and Malcolm Gladwell types will be familiar with Dunbar’s number. Based on research he did in the early 1990s, anthropologist Robin Dunbar proposed the theory that humans are limited by their brain capacity in how many social connections they can usefully make. Studying how social connections are processed in the “orbitomedial prefrontal cortex” of the brain, Dunbar concluded the “mean group size” of the average human is 148.

What that means is the average person can only make and maintain 148 social connections, be they friends, family, co-workers, teachers, or your dentist. Sure, you can certainly know of more people (as your Facebook friend list may tell you), but in total the average limit of people you can truly know is set at 148.

In fact, Dunbar has updated his research in this area to include the world of social networks, wherein it’s not so odd to have well over 500 “friends”. He found what most of us already knew: your friend count is pointless vanity. Since the average adult Facebook user has over 300 friends, more than half of your connections on Facebook are, by Dunbar’s math, pointless.

Does this neurological limitation extend to how we feel about fictional characters? It’s hard to say. In a Cornell study from 2012 on the social networks of mythological characters, two researchers built a method for determining whether a storyline about characters was based on real events, simply by analyzing how the characters interact and whether those interactions mirror how real human networks behave. So while there is validity in relating how humans interact to how fictional beings interact, it’s difficult to say what happens between the two worlds.
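
For the curious, here’s a minimal sketch of the kind of test that study describes, assuming Python with the networkx library and a hypothetical, hand-typed list of who interacts with whom (my toy example, not the study’s actual data). Real social networks tend to be highly clustered yet short-pathed, and assortative (popular people befriend popular people); networks of invented characters often fail those checks.

```python
# A sketch of the technique described above: build a character
# interaction graph and measure the statistics that real social
# networks reliably exhibit.
import networkx as nx

# Hypothetical edges: each pair of characters who share a scene.
interactions = [
    ("Oberyn", "Tyrion"), ("Oberyn", "Gregor"), ("Tyrion", "Cersei"),
    ("Cersei", "Gregor"), ("Tyrion", "Jaime"), ("Jaime", "Cersei"),
]
G = nx.Graph(interactions)

# Real social networks are "small worlds": clustered, with short paths
# between people, and degree-assortative. Scores far outside those
# norms suggest the network was invented rather than observed.
print("clustering:", nx.average_clustering(G))
print("mean path length:", nx.average_shortest_path_length(G))
print("assortativity:", nx.degree_assortativity_coefficient(G))
```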

In her book, aptly titled Why Do We Care About Literary Characters?, Brit-lit scholar Blakey Vermeule posits that we might actually learn more about fictional characters than we do about the people around us. Because of the narrative focus of certain books, we get a front-row seat to another person’s psyche when we read a novel, something you may never get from your 148. This is also why reading fiction can make you more empathetic.

When a personal connection is driven into you by an author’s sheer force of will, it emulates how you would feel if these characters were real. But does emulation equal reality when it comes to your prefrontal cortex? Does the social part of your brain treat figments of someone else’s imagination as genuine human relations, or does it merely think books are training for the real deal?

The core piece that’s missing, of course, is that Oberyn Martell never knew you. Because you know of Oberyn Martell or Lorna Wing or whoever the next big celebrity death is only from a viewpoint of consumption, the relationship is not a social one. While they can change you, you can never change fictional characters. It’s this void that fan fiction aims to fill. Even in the most interactive video games, there is very little a player can do to change the intents and attitudes of the characters around him (I should probably say that I’ve never played Heavy Rain, but my understanding is you can affect events, not characters). So even when we become a character in the storyline, we still fall short of a social experience–the necessary requirement for the prefrontal cortex activity that would count toward your 148 limit.


When The Symphony Stops

This week, I wrote an article for The Daily Dot about the fantastic Spike Jonze film Her and its subtle commentary on what futurists call “the Singularity”, that point when humans, through advancements in artificial intelligence, will use technology to surpass the earthly bonds which limit our potential.

It admittedly sounds like a lot of New-Agey nonsense, but the idea that we can, at some point likely in this century, download our consciousness onto a hard drive is quickly gaining traction. None other than Google itself has invested heavily in the idea.

What this would look like from a UI–as opposed to AI–perspective is troubling. If Ray Kurzweil is right, and we’ll have this ability by 2045, I may well live to see this technology at an age when I would most need it. What the technology would gift us is a chance to design and choose our own afterlife of sorts, while still being able to communicate with the outside world digitally. It would even allow us to expand the limits of our consciousness, enabling us to accomplish intellectual feats far outside what we can perceive in our current puny human state.

However, if we can not only simulate consciousness in a digital sphere but actually replicate an already existing consciousness, does this not disprove the existence of the soul?

Let me explain. Most religion is based on the idea that our consciousness is not a material attribute but a spiritual one. Unless you’re prone to believing we can transmit spiritual essence onto non-living physical objects (like a hard drive), then what would be left behind when we switch off the brain and copy the electrical synapses of our consciousness onto that drive?

The answer, I suspect, would be: not much. If you drain the brain of the electrical activity which keeps not only our memories but also those inherent processes that allow us to function as biological entities, we’ve traded a vibrant body for a dead one.

The other option is that we transmit only the contents of the conscious mind, leaving behind the subconscious, which raises interesting questions about repressed memories and subconscious-conscious activity, like the hidden secrets which cause mental illness. Would depression travel with me onto the computer?

But back to the soul. I, philosophically, am what is called a “Materialist”: one who believes there are no abstract concepts in the Universe, and that the things we call love and hate and envy and gratitude are only the firing of synapses, a projection of the continually mysterious functions of the brain.

The best way I’ve heard this said comes from an anonymous post on a forum since lost to time: “Asking what happens to the soul when the brain stops working is like asking what happens to the music when the symphony stops playing.”

Yet even that metaphor, in the context of the Singularity, falls short. If I record the symphony, I cannot rearrange the brass and the woodwinds into brand-new music entirely. Consciousness allows us to alter who we are, and digital consciousness would extend that ability possibly into infinity.

But the digital space we’d be living in would have a design, the UI I mentioned earlier. What do you want to do for eternity? You can always change your mind. You can meditate, learn, play, fuck, do whatever you want, really.

But who’s on the outside of this system? Who do you trust enough to hold the small, black box that contains you?

Which raises even more troubling questions. Could this not open the door to punishments Caligula could only have dreamt of?

In 2011, three Somali pirates murdered four American civilians aboard the yacht Quest. Captured and tried in the United States, the three young men were sentenced to “21 life sentences, 19 consecutive life sentences, two concurrent life sentences, and 30 years consecutive.” This style of sentencing is typically used to ensure that the defendants in question never achieve parole, having to prove to a parole board over 40 times, once every 25 years, that they are capable of existing in society. Even assuming they get parole at every hearing, that puts their release date over 1,000 years from now. The only escape from their imprisonment is death.
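
If you want to see that math, here’s a quick back-of-the-envelope sketch. The assumptions are mine, not the court’s: one parole hearing per life sentence, roughly 25 years served apiece, and concurrent terms adding hearings but no extra time.

```python
# Rough check of the sentencing arithmetic above.
consecutive_life = 21 + 19   # life sentences served back-to-back
concurrent_life = 2          # served simultaneously; add no time
extra_years = 30             # the "30 years consecutive"

hearings = consecutive_life + concurrent_life      # 42 -> "over 40 times"
total_years = consecutive_life * 25 + extra_years  # 1030 -> "over 1,000 years"
print(hearings, total_years)
```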

But what of the digital sphere, where death is theoretically avoidable? Could we not trap these three gentlemen’s conscious minds in digital dungeons, possibly even slowing their perception of time to the point that they serve 1,000 years in an afternoon?
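
(A bit of hypothetical arithmetic, using my own rough numbers, shows the compression factor such a dungeon would need:)

```python
# Hypothetical arithmetic: how much faster would subjective time need to
# run for 1,000 felt years to fit inside one real afternoon?
SECONDS_PER_YEAR = 365.25 * 24 * 3600
afternoon = 4 * 3600                 # call an afternoon four hours
factor = 1000 * SECONDS_PER_YEAR / afternoon
print(f"{factor:,.0f}x")             # ~2,191,500x
```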

Consider the philandering husband who entrusts his hard drive to his estranged wife. Could she not enact the most severe of revenge fantasies?

Consider the Westboro Baptist Church. Could they not devise a realistic portrayal of Dante’s Inferno for those they consider the most vile of sinners? Entrapping possibly even strangers in horrific pain and agony for eternity?

What this technology, for now a hypothetical, would do is not just disprove the existence of the soul but finally confer on the conscious mind all the ethereal qualities of a soul. We could finally invent the afterlife we have envisioned for ourselves since the dawn of the conscious mind. For so much of human history, we have wondered at ourselves and assumed a greater being must have endowed us with such ability and insight over the beasts of the field and the birds of the sky. If only we knew we are the gods we’ve been waiting for.

A Message To DC Journalists: Shut Up And Write

Honestly, just shut up.

You are journalists who not only chose your career path but chose to go to DC to cover politics. You worked very hard on deadline after deadline and have been rewarded by getting a beat that involves cocktails and parties and long, boring hours filing reports in crowded booths. You are at minimal risk of being kidnapped or shot. Hazmat suits are not part of your wardrobe. You are not Richard Engel. You are not even Anderson Cooper.

Not to say that your life must be at risk for your work to have value. DC journalists play–or at least should play–an important role in covering policy, and I’d even give credit to personality pieces for helping the average person understand the interpersonal politics at play in otherwise wonkish debates like the budget.

But shut up. Shut up about DC being a corrupt playground for the rich. Shut up about your existential crises over going to too many parties and screwing too many girls. Shut up about your constant, self-driven guilt over giving more coverage to Barack Obama’s selfie than to poverty or war or, y’know, stuff that might actually affect people. You are not F. Scott Fitzgerald in NYC or Hemingway in Paris. You are Tucker Max at a titty bar. Shut up.

You are spoiled. You are pampered not by the standards of truckers and lumberjacks, but by the standards of journalists. You are surrounded by news all the time and yet whine about it. You are staff writers at some of the biggest-budgeted publications on Earth. Most of you can get published anywhere short of National Geographic. Sure, maybe you remember your hard-scrabble days of chasing leads and sending out clips. But guess what? You escaped the Gravity-esque life of grabbing at any damn thing and hoping someone will pay you for it. You are given actual money–actual, liveable money–for your thoughts on whether Santa is white or not. Shut up shut up shut up.

I love you guys. I do. I love John Stanton’s gruff sincerity. I love Dave Weigel’s self-aware sneer. I love Zeke Miller’s attention to detail and The Hill’s policy-centric approach. I even love Politico for its self-endowed lack of seriousness, the campus zine of DC. It’s fun and informative; good and good for me!

But shut up. Now that DC has been belly-up for roughly half a decade, now you decide to make an attempt at a conscience? Now you feel the need to navel-gaze and make a change? Go right ahead and take Sam Youngman’s advice and drop your Twitter and avoid the bars and the parties like a real goddamn bama. It will change so much nothing. It will not touch any single human being in any meaningful way. It will not make the news better and it sure as hell won’t make the government better.

You’re being brats. You just now feel like a generation ascendant, a real gumshoe attaché to the grand tradition of getting drunk with lobbyists. You are nowhere near as important as you seem to think, and it’s all your own damn fault. You descend into the lunacy of fake controversies. You trust people in charge to tell you what you want. You trade favors for access, fluffy and humanizing reporting of slugs, so the average person can feel everything’s alright, Ted Cruz’s father fought Castro, everything’s alright.

I know you’d probably like to think your buddies up the I-95 corridor made journalism the sad sack of shit it is today, but no. You’re equally responsible, if not more so. You’re the reason we live in a constant election atmosphere, one filled with hyperbole and filth. You’re the reason Donald Trump could go on TV and make his face apparent. You’re the reason Paula Deen became a national crisis. You’re the reason George Zimmerman is a national hero/villain. You expanded the arguments of “This Town” (which you say with such damn pomp) into the daily lives of millions who just want to find out if their paycheck will be smaller or larger next year. You sensationalize rubbish, abuse tragedies, and bleed out the very optimism that could save a generation from ever becoming you.

So please, please, please. For the love of all truth and hope, for the sake of not “This Town” but “Your Town”, shut up. Stop complaining that a city you make is too horrendous for you. Quit your bitching and do your damn job.

A Book, Sex Scandals, Dopamine, and Narratives

Believe it or not, I think it would be good for my writing if I began a collection of essays with an eye toward a book. I have trouble writing for extended periods of time because I usually feel the fewer words needed to express an idea, the better. The other complication is that I am not a policymaker, a known intellectual, on TV, or funny. Usually, good essayists are at least one of those four things. David Rakoff was never that funny, but he was to some people. As are Sarah Vowell, David Sedaris, and Chuck Klosterman. I struggle with this because I rarely have to fight to make people laugh in person. I’m an excellent conversationalist and a terrible humorist.

The other detail to consider is what the hell I’d write about. I’ve often thought I’ve lived a life that could blow Running With Scissors into oblivion, but again, it’s hard to sit and think of how to make your father’s death or your mother’s alcoholism or your own substance abuse into a compelling narrative. There is certainly a market that will voluntarily subject itself to horror stories of the modern world, Angela’s Ashes of the digital age. But I’ve never been certain that’s the kind of story I want to tell. I’ve written some awfully depressing things for Thought Catalog, some real tearbait, and I didn’t have to lie or exaggerate to do it.

Then there is the problem of Milton Hershey School. I could tell some stories based on conjecture and rumor, but I would have to represent them as such.

For example, I knew a kid named Eric who, whilst in the 8th grade, raped a 6th-grade girl on campus. He was made to disappear, presumably sent back home with the valuable lesson that he could rape with impunity (allegedly). There was a student home called Moldavia where two brothers were not only fucking each other but raping some of the smaller boys in the home (allegedly). Then there was this very real, non-rumor story about the son of a substitute housemother who raped and/or molested at least five girls while his mother worked for the school (whilst also owning over 700 images of child porn). Once more, MHS paid over $3 million in settlements to the families of the girls and boys affected by another child of a houseparent, who was likely using kids at MHS to produce child porn (note that, yes, that link looks shady, but it leads to a now-nonexistent philly.com story). And all of this was occurring while I was there, between 2002 and 2007, so obviously some fucked-up shit was going on that could make a few good essays.

I realize exactly how crass that sounds, but in writing about my life I’d be monetizing terrible things that happened to me as well. Not Sandusky-level terrible things, but utter neglect, social services, homeless shelters, and a general level of chaos unfit for any child. So how much should I focus on, for instance, my mother’s drinking habit? Do I have an adequate number of anecdotes about her tumbling around or getting sick? Do I want to adapt it to a CW format or simply write straight out how she would often forget to get us dressed and off to school? And what thematic sense do I make of the awful condition of our infested home?

So I just took a break, retweeted a tweet (this one), looked at a Buzzfeed article (this one), and am going to stop being so cynical. So, yeah. Essay book. What I really want to do is combine these personal worlds with the sorts of things I do for The Daily Dot. I don’t mind Personal Woe Is Me Ben, but I would prefer to be Tech Pundit Ben. I like Tech Pundit Ben. He’s informed and has all the links in the right places and is certain about what he says and is rarely accused of being egotistical or brash. I often find myself contradicting ideals I’ve previously sworn up-and-down are real because, in conversation, I attach myself not to what I think is right but to what I find most fascinating. My discussion and writing are far less concerned with hitting upon a truth and far more concerned with locating narratives.

I’m currently reading The Black Swan, a fantastic book about recognizing preconceptions and how they hurt us in preparing for–or even identifying–the unexpected. Nassim Nicholas Taleb comes down rather hard on narratives, saying they are of little worth and often lead us to misdirect our attention away from key or contradictory facts and conclusions. This is true: I get exasperated watching the MSM wrestle a story into a narrative, often with misleading connections and, at worst, false accusations. Fox News is particularly good at this, and little of the political game around it actually matters.

So why do I like locating these narratives? Why do I enjoy finding patterns where, at least some of the time, there are none? I’ve written articles about dogs on Reddit and how they relate to the overall maturity of my generation and its lack of need for things like family and responsibility. I have a fucking problem.

Turns out, as The Black Swan reports (as does NewScientist), the University Hospital of Zurich has found that people with high levels of dopamine place more importance on coincidences. The difference between a coincidence and one thing leading to another is enormous. Most trials rely on this distinction, whether the defendants are burglars or warlords. It speaks to causality, an important factor when attempting to find narratives within the real world. I forget where, but I once read that the difference between a fact and a story is “the king died” versus “the king died of grief.” Adding cause to an event is the birth of the story.

The idea that our ability to identify or even appreciate narratives is attached to dopamine levels has some rather far-reaching implications. Higher dopamine levels are also consistent with depression, drug addiction, ADHD, and psychotic disorders like schizophrenia. Many of these conditions rely on constructing and believing in narratives about ourselves: “If I go out tonight I just won’t have any fun, because I’m not liked by anyone,” says the depressive. “I deserve this cigarette because I work hard,” says the smoker. These are not just major assumptions but even falsifications which enforce a narrative our brain finds pleasing. Hell, since a drug like marijuana increases the level of dopamine your brain creates, it could possibly even explain why potheads are prone to seeing patterns while high that they may not while sober, such as syncing up The Wizard of Oz and Dark Side of the Moon. That example attaches meaning to otherwise disconnected events (Dorothy looking around when David Gilmour sings “look around”). But then again, I have no empirical evidence to back up this statement, only circumstantial conjecture.

So when I say something like “Twitter redesigned its website to have a more subtle (and therefore more successful) IPO than Facebook by impressing advertisers,” I am assuming THOUSANDS of single events. Yet I hold other mediums fully responsible for this same mistake. Then again, my writing is (usually) stamped with the big, ugly word “Opinion”. And it is “Opinion” (though I prefer “Analysis”). The difference between opinion and fact is what you know and what you assume.

I assume Jack Dorsey okayed the redesign’s aspects to make ads more visible and even more enticing, and did so a week before the IPO to show off his yet-to-be-profitable website as potentially more profitable, something Facebook never felt the need to do until after its IPO, simply because Facebook was already profitable. That is as much a guess as saying “my mother ran away from home because her parents were strict.” I have no actual idea if these things are true, and no facts or experiences to support them, other than facts which seem to imply them. Voilà: the beginnings of a narrative.

The Protester Becomes The Protested

Back in 2004, a common image dragged out by both the DNC and the RNC was of John Kerry testifying against the Vietnam War before the Senate Foreign Relations Committee, the same body he spoke to today in favor of “limited strikes” against Syria by the US. He was undoubtedly a protester at that 1971 hearing, a card-carrying member of Vietnam Veterans Against the War with a hero’s record and demeanor.

Today, as he and Secretary Chuck Hagel asked the Senate to please pretty please let them bomb Syria, irony became the overarching theme. Oh, Secretary Kerry promises there’s proof of WMDs? And that the war won’t last long? The man who ran against Bush on the exact opposite platform less than a decade ago is now promising me this?

However, things became a great deal more odd when a Code Pink protester very predictably interrupted the hearings. While so common as to be a nonevent, Kerry’s response to the protester caught my attention:

“The first time I testified before this committee when I was 27 years old, I had feelings very similar to that protester, and I would just say that is exactly why it is so important we are all here, having this debate, talking about these things before the country. And that the Congress itself will act representing the American people. And I think we all can respect those who have a different point of view, and we do.”

Responding to the protester is the classiest way out for any politician. Witness the difference between Obama responding to a heckler in his crowd–even going so far as to validate their concerns–and Mitt Romney staring steely-eyed as he talks over the protester with his scripted speech.

There are two schools of thought on responding to hecklers as a politician. You can attempt to march on with your speech or your testimony with little to no deference to the meager shouts, or you can stop and give a brief response to the protester, perhaps even dropping some praise for First Amendment rights.

The move-along approach does two counterproductive things. First, it denies attention to the protester. Second, in doing so, it validates the protester’s concern. Every protester’s base concern is that they are not being heard, and when you are the one in charge deciding to ignore them on camera, you make at least the protest, if not the content of the protest, seem to have a genuine and serious purpose.

The confrontational approach has several downsides, but also some large upsides. If you’re warm and welcoming to the dissent, you’ll likely not have much to fear. Throw in a few good words about free speech la di da and enjoy the applause. Motion to have the prompter reeled back and continue on. However, if you aren’t great at going off script or merely can’t keep your lid on tight enough, this can be a dangerous game. See Michelle Obama’s blowup at a protester at a fundraiser just last June, which made her seem weak and defensive. Or Chris Christie, who humbly believes he can embarrass his way to victory in an argument as he regularly thrashes hecklers to the side, gaining him a reputation as a blowhard (or “strong conservative” depending on your viewpoint).

Secretary Kerry has entered this arena before. Just last January, at his confirmation hearing, Code Pink greeted him with the same tactics we saw earlier today (i.e. wearing pink and shouting). Kerry gave an astonishingly similar response to the one he gave today:

“When I first came to Washington I came here as part of a group of people who came to have their voices heard. And that is above all what [the Senate] is about. So I respect the woman who is voicing her concerns about that part of the world.”

Of course, nothing could match the Heckler To End All Hecklers, Andrew “Don’t Tase Me Bro” Meyer, who confronted then-Senator Kerry during a speech at the University of Florida in 2007. In response to the now-famous incident, Kerry’s office issued the following statement:

“In 37 years of public appearances, through wars, protests and highly emotional events, I have never had a dialogue end this way. I believe I could have handled the situation without interruption, but I do not know what warnings or other exchanges transpired between the young man and the police prior to his barging to the front of the line and their intervention. I asked the police to allow me to answer the question and was in the process of responding when he was taken into custody. I was not aware that a taser was used until after I left the building. I hope that neither the student nor any of the police were injured. I regret enormously that a good healthy discussion was interrupted.”

One could even say this was the beginning of Kerry making a point to address protesters and congratulate them on their role in the democratic process. When you stand idly at a podium as police and campus security hold a man down and shoot electricity through his system, you can feel a bit obligated to make up for it.