on being behind

My calendar is a masterpiece.  I keep up on birthdays and pay my bills before they’re due and go to the doctor regularly and spend small fortunes on preventative maintenance. I’ve never been responsible for missing a flight.

I reckon that none of this would come as a surprise to people who know me at work.  As a professor, I’m punctual and reliable.  My classes begin and end at the appointed minutes.  I generally respond to student emails within a day.  I return graded papers promptly, and sometimes earlier than promised.  During my short stint this semester as an interim department chair, I handily kept the proverbial trains (and most of the meetings) running on time.

But otherwise, I am so behind.  When someone asks me how I am, “so behind” is an increasingly common answer (along with “fine” or “tired”).  This behindness is concentrated heavily in my research agenda.  Of course, I meet deadlines in all but the most extraordinary circumstances, and sometimes even then.  But I still have this sense of being off, of lagging, of inertia, of delay.  Relative to what, I’m not sure.  Maybe nothing.  But it’s there, and insistent, nonetheless.

Admittedly, outside the realms of obligations and have-tos, I’m often running late.  I text en route apologies for my tardiness.  I scoot into the yoga studio moments before the teacher hangs the “class in progress” sign on the door.  If restaurants and salons didn’t have grace periods on reservations and appointments, I’d have to do a lot of rescheduling.

But my experience of “behind” is different.  “Late” is concrete and quantifiable.  “Behind” is abstract.  “Late” is preventable with basic adult skills like time management and learning from past mistakes.  “Behind,” at least as I experience it, does not have a behavioral fix.  Lateness is basically a function of the laws of time and space, of their constraints.  Behindness, on the other hand, doesn’t have much to do with logic.  Instead, it’s a backward-looking perception of all that I didn’t do, and should have done, while time passes, heedless of my agenda or efforts.

I’m late, I’m late, for a very important date.  With my research agenda.

I’m not alone in this, I don’t think.  My academic pals voice similar complaints.  “I am totally caught up on everything,” said no academic, ever.

A couple of years ago, I read Sarah Sharma’s remarkable book, In the Meantime: Temporality and Cultural Politics.  Sharma develops the idea of “power-chronography” to capture the saturating overlay of power onto time in the domains of labor, embodiment, and social life. It would be absurd to compare the work that I do to that of the taxi drivers, for example, that she studies extensively; my situation is much more comfortable, far less precarious.  But all of the people that she describes in her book have jobs that vex their experiences of time.

Professor-ing has its own temporal idiosyncrasies.  We get a lot of breaks and time “off,” but this also means that we make a lot of transitions.  The interval between writing a thing and seeing it in print is yawning, and variable.  Every semester, in the classroom, we start from scratch.   Our workflows depend, in fundamental ways, on the whims of dozens of 18-year-olds.  Set arbitrarily to a 5-to-7 year cycle, tenure clocks tick, or get stopped.

In various ways, academics talk a lot about our time.  We extol the virtues of winter and summer breaks or sabbaticals, lament how fast they seem to go.  We note, appreciatively, the relative flexibility in our schedules.  We complain or humble-brag about how busy we are.  We evaluate ‘work-life balance’ and our allocations to each side of the hyphen.

These are all relatively tangible things.  “Behind,” on the other hand, is not.  Experiencing oneself as “behind” is, I think, central to the temporal experience of academic work.  It arises at the intersection of whatever psychic characteristics predispose us to pursuing this career and the practices of the institutions where we enact it.  In my own life, it underpins writing guilt (because if I was working instead of enjoying myself, maybe I wouldn’t be so behind).  It inspires me to say “yes” when I should probably say “no” or “not now” (because it’s more appealing to agree to a new project that I’m not behind on yet).  It prevents me from appreciating what I do accomplish (because the focused effort required to finish one thing left me even more behind on everything else).  And so “behind” perpetuates itself.

And I’m not sure what to do.  I work to capacity most days, while still endeavoring to preserve a life that is tolerable and meaningful outside my office.  So I think about those workplace renegades who, finding themselves hopelessly deluged with email, simply zero out their inboxes with a wild “Select All” and “Delete,” on the assumption that if there was anything important, the sender would write again.  Theoretically, I could do the same thing with my calendar, tear out those old pages full of their undone to-dos.  Clearly, none of them were absolutely essential or really time-sensitive, so I’m not sure what I’m holding on to.  But somehow, I can’t bring myself to do it.

So I work suspended between my past expectations and hope, endlessly deferred, for a future in which everything is current, and I am always right on time.

“it might be, it could be, it is …”

Right around this time last year, I wrote about the affective pedagogy embedded in the work of loving the Chicago Cubs, how it might condition a fan to a form of hope disconnected from optimism and expectation.  And I mused that such an orientation might provide a meaningful alternative (really) to forms of forward-looking attachment that leave us perpetually disappointed, individually, and turn us into grasping subjects of neoliberal capitalism.

This year, everything looks different.  Not the neoliberal capitalism part: that’s still there. But the Cubs part, yes.  Last year, the Mets swept the Cubs in the NLCS.  This year, the Cubs are holding their own in the World Series.  This postseason, when people talk about ‘underdogs,’ they are emphatically not referring to the Cubs.  Crazy.

I grew up listening to Harry Caray call Cubs games, and he often narrated the flight of a long fly ball in anticipatory, increasingly exuberant stages: “It might be … It could be …” and then, if the shot cleared the outfield ivy, “It is!  A home run!”  Of course, most broadcasters of that era had such signature lines (my second, charmingly nonsensical, favorite is from Baltimore: “Go to war, Miss Agnes!”).  But it strikes me now that there is something essentially Cubs about “It might be, it could be, it is.”  To the extent that this triptych still resonates for Cubs fans, I suspect that the affective structure I wrote about before, and all of its promise, remains.

In a recent op-ed for the New York Times, Rich Cohen reflected on a lifetime of Cubs disappointments, and speculated that so much might have been different if he’d grown up watching a winning team.  “Maybe,” he wrote,  “I’d have learned to cherish my fellow man and take yes for an answer and accept all the love that’s been showered on me.”  He also suggested that if the Cubs keep winning, Cubs fans will start losing their identity: “being a Cubs fan always meant something and now will mean something else.”

But I’m not sure that affective histories dissolve so quickly.  It’s true that Pollyanna-ish observers might take a still-very-hypothetical World Series win as a sign that good things inevitably come to those who wait, or some other such rubbish.  And it’s also true that lots of ballpark fans have been waving signs proclaiming that the Cubs, and Chicago, “deserve” to win it all this year.

I, too, am enchanted by the story of this postseason, mostly because it makes me feel like things could be other than they are, or have been, like something new is possible.  But “could be” is all I can count on, and really all I want.  “Could be” is the glittery essence of possibility.

And possibility forms the core of this affective magic.  It’s not the same as likelihood or even probability.  Possibility lives in might be-could be suspension.  Caray’s gruff singsong was not “It might be … It will be” or “It might be … It must be” or “It might be … It should be.”  Just might, and then, a few fractions of a second later, a little more surety but still no promises: could.  The very literal-minded might say that such subjunctive hedging is necessary because Wrigley Field is unpredictable, what with the wind and all.  Sure.  But that’s not really the point.

Sometimes, Caray was wrong: might be, could be, isn’t.  But that inaccuracy is as much a part of the field of possibility as other, jubilant, instants of rightness.  Possibility, at its most radical, entails unpredictability.

Contemporary systems of threat-assessment and risk-management expressly target unpredictability (see Louise Amoore’s The Politics of Possibility for a brilliant analysis of this phenomenon): in security, in markets, in human behavior.   Their preferred modes are anxiety, prediction, and preemption.  They cannot abide the space between “could be” and “is.”  Far too risky.  Just an inkling of “might” and they activate, begin engineering the unexpected away.

Obviously, sports fandom operates on a different, and arguably trivial, register.  But at our present, and wearisome, juncture, I’ll take an alternative wherever I can find it.  The Cubs have made it this far; we’re well past “might” and holding our breath in “could.”  “Is” would be awesome.  But if it doesn’t happen this year, there’s always next, or the one after that.

Possibility is endlessly renewable.  And to dwell in it, even for a few weeks in the fall, is to refuse the twin certainties of “won’t” and “will” and all the potentials and pleasures they foreclose.


a journey of 8500 miles begins with …


…  a layover in Newark.

I’m on my way to Hong Kong for “Imperial Benevolence: U.S. Foreign Policy in American Popular Culture Since 9/11.”  Thanks to the organizers for the tantalizing invitation!

I’ll be discussing a new project called “Imperial Cry-Faces: Women Lamenting the War on Terror.”  You can click here for an abstract.

And here’s a preview of the first couple of pages …

“It’s okay,” she says through her tears, patting the bald eagle on the head.  She has shifted her torch and tablet to the crook of her left arm, and stretched out her right to console her feathered friend, who weeps with his talons wrapped around a flagpole extended over the waves.  This crayon drawing of a crying Statue of Liberty—by an elementary school student named Eddie Hamilton from Knoxville, Tennessee—is held by the Library of Congress as part of its September 11, 2001, Documentary Project.[i]  But young Eddie was far from the only person to imagine this kind of emotional life for the cast-iron woman.  One of his classmates imagined her similarly distraught, and so did a number of political cartoonists, along with more than a few tattoo artists (as I discovered serendipitously during an internet search).  But if crying for the victims of the September 11th attacks is so necessary and so automatic that even a statue can do it, the appropriate emotional response to the wars that followed the attacks is much harder to discern.  In my current book project, entitled Figuring Violence: Affect, Imagination, and Contemporary American Militarism, I trace the currents of affection, admiration, gratitude, pity, and anger that circulate around privileged objects of sentimental investment: children, military spouses, veterans with PTSD and TBI, detained enemy combatants, and military working dogs.  Here, however, I ask a different question: who cries for U.S. empire?  Perhaps the passing of time and the balms of revenge and pre-emption have offered Lady Liberty the same comfort that she extended to the eagle.  But apparently not everyone can survey the landscape of contemporary American militarism with so stiff an upper lip.

Accordingly, this paper maps the intersections of gender, sadness, and imperial violence as embodied by the crying female protagonists who populate the American media landscape of the Global War on Terror (GWOT).  The ruthless interrogator who weeps quietly at the end of Zero Dark Thirty, the drone operator whose eyes spill over during every strike in Good Kill, and the CIA agent who sobs theatrically all the time, over everything, in Homeland: these women do the lethal affective work of empire.  And it makes them feel bad … not necessarily bad about it, but certainly bad around it.  My goal here is not simply to analyze these representations of emotionally frail female warriors; rather, I want to consider the political and emotional complexities of their crying.  This inquiry emerges from my abiding curiosity about the role of emotion in contemporary American militarism and, more specifically, my skepticism about the capacity of sentiment to challenge it.  Marita Sturken has argued that in the aftermath of September 11th, “the paradoxical effect of the nation under threat is that modes of sentiment that might have been perceived as weakening its stature become the terrain through which it is recuperated.”[ii]  In this way, feeling bad for the victims of U.S. imperialism coexists easily with ideas of American exceptionalism, on the logic that only so enlightened a nation would be sensitive enough to lament its casualties.

Two assumptions about the act of crying, in general, inform my analysis here.  First, tears do not always lend themselves to interpretation.  Anyone who has ever tried to soothe an inconsolable child or has found themselves crying without really knowing why understands this intuitively.  Tom Lutz, in his singular volume on crying, identifies this inscrutability at the heart of the interpersonal dilemma that crying poses, because crying appears to be such an insistently communicative behavior.[iii]  Second, tears, like any other emotional phenomenon, have both individual and structural origins.  Ann Cvetkovich, in her work on depression, raises the possibility that systems like neoliberal capitalism, along with war, states of exception, and intense securitization might manifest in individual depressions.[iv]  Working from these premises, my focus here is not so much on the narrative contexts in which these female protagonists cry, but rather on how their crying might register the historical moment from which these texts emerge, and what kinds of affective pleasures and pedagogies they might offer their audiences.

The act of crying, at least in contemporary Western cultures, is gendered feminine.  Lutz notes that in canonical depictions of crying, like literature or epic poetry, men cry, but predominantly about matters of state, like war, peace, and political ideals; women’s tears are reserved for the personal.[v]  Yet the films and television show I analyze here deviate from that pattern, at least partially, as all the female criers emote for reasons that cannot be reduced to individual woe (though many critics, both within the diegetic universe of the pieces and in commentaries about them, interpret their crying as signs of personal weakness).  Elisabeth R. Anker’s work on the ascendance of a melodramatic style in American politics since the mid-twentieth century suggests a pervasive emotionalism in U.S. policy, both domestic and foreign.  The melodramatic style, as Anker describes it, “casts politics, policies, and practices of citizenship within a moral economy that identifies the nation-state as a virtuous and innocent victim of villainous action.”  She continues: “By evoking intense visceral responses to wrenching injustices imposed upon the nation-state melodramatic discourse solicits affective states of astonishment, sorrow, and pathos through the scenes it shows of persecuted citizens.”[vi]  Melodramatic political discourses, like melodrama itself, are driven by the affliction of the innocent and the helpless; translated onto the nation-state, they “draw upon a moral economy that locates goodness in national suffering, and that locates heroism in unilateral state action against dominating forces.”[vii]  Hence the weeping Statue of Liberty.  Conversely, the crying ladies of Zero Dark Thirty, Good Kill, and Homeland cry from positions of state-sanctioned power.

[i] Eddie Hamilton, “It’s OK,” 2001, Library of Congress American Folklife Center, AFC 2001/015: gr015d.  Available at https://www.loc.gov/item/afc911000239/.

[ii] Marita Sturken, “Feeling the Nation, Mining the Archive,” Communication and Critical/Cultural Studies 9, no. 4 (December 2012): 353-364, quot. 357.

[iii] Tom Lutz, Crying: The Natural and Cultural History of Tears (New York: W.W. Norton & Co., 1999), 19.

[iv] Ann Cvetkovich, Depression: A Public Feeling (Durham: Duke University Press, 2012), 11-12.  She also contends that most theorizations of these systems are too abstract to capture their emotional consequences for individuals.

[v] Lutz, 64.  Of course, when women don’t cry in situations where they apparently should, they are regarded as unfeeling at best, suspect at worst.  For example, many people have noted that Mariane Pearl (the widow of Daniel Pearl, a journalist who was beheaded by Pakistani militants in early 2002) does not cry in public.  The filmic adaptation of her story, A Mighty Heart, reflects this.

[vi] Elisabeth R. Anker, Orgies of Feeling: Melodrama and the Politics of Freedom (Durham: Duke University Press, 2014), 3.

[vii] Anker, 31.


what i learned about writing by not

All is not lost.  What I have lacked in tangible productivity over my long season of writer’s block (which seems finally to be limping its way to a close), I have gained in new understandings of the intricacies of my writing process and the fussy mechanics of getting words on the page.

When you aren’t getting words on the page, it’s crazy annoying (at best) to hear about people that are.  And it’s similarly unpleasant to receive unsolicited suggestions about how to get yourself unstuck.  As if it was simply a matter of will or ergonomics or mental hygiene.  But if it was that easy, anyone could do it.  Producing good work, and doing it well, takes more than that.  So here are a few things I figured out about being productive when I was struggling to produce anything at all.  It’s an open letter, of sorts, to my writerly self – the “I” is me, and so is the “you.”  But the “you” can also be, you know, you, if you are reading this and wanting to reconsider your writing praxis.

Maybe writing would be easier if I wore more ermine /// Johannes Vermeer, “A Lady Writing” (oil on canvas, c. 1665)

Become attuned to your limits.
It’s hard to tune out the constant drone of academic meta-commentary about how much (or, from the occasional maverick, how little) we work.  And it helps to know that most of those aggrandizing self-reports are bullshit.  But even still, focusing too much on what other people are doing, or not, just leaves me insecure, or anxious, or envious.  So spend less time worrying about what other people are doing and focus on your own patterns. Then figure out how you work, and be honest about whether all the hours you spend “working” are actually that.  For example, I’ve figured out that I’m neither efficient nor terribly lucid after dinner, and that even when I go back to work late in the evening, I’m not getting much done besides maybe assuaging my guilt about not working enough.

Diminishing returns are a thing.  So consider whether you might be better served by reinvesting those mediocre or largely symbolic work hours elsewhere.

Figure out how you want the experience of writing to feel.  
Turns out, there are no extra points for suffering.  Or if there are, they circulate in an economy that is wildly unrewarding.  Like the counters where you redeem your tickets at arcades: a small fortune in tokens and hours spent playing Skeeball leave you with an armload of little cardboard rectangles and the teenager in charge of the whole operation barely acknowledges you when you come to select your prize and it ends up that all you can afford is a pencil case.  Anyway.

Few of us have the luxury, presumably, to only write when it feels good.  Deadlines, tenure, promotion, &c.  But unless you produce your best work in the throes of abject misery, experiment with the novel practice of setting your writing aside when writing feels terrible.  We all have different thresholds for ‘terrible,’ and that terrible feeling might be mental or physical, but when you encounter that threshold, I think it’s smart to heed it. Admittedly, I am still relatively new to the routine of being a peer-reviewer, but I have not yet encountered a reviewer questionnaire instructing me to give special consideration to a project if I think the author cried a lot (A LOT) while they composed it.  And if there are people who will give you extra credit for your anguish, think carefully about whether you want to play by that set of rules.

Spend some time thinking about how it feels when you are doing your best work.  Maybe you feel focused, or excited, or peaceful, or maybe you’re so in it that you don’t feel anything at all.  Take advantage of those times, figure out how to increase their frequency if possible, develop strategies for doing good-enough work in circumstances that only approximate them.  And otherwise: leave it alone.

Work at a pace that’s sustainable.
Pretty much every academic I know, including me, is overcommitted.  There are lots of reasons for this, both individual and structural.  Obviously, everybody will define “overcommitted” in their own ways, and experience being overcommitted idiosyncratically.  I’ll need to figure out, eventually, why I have a tendency to hoard projects, but here’s what I know for now: I tend to overestimate the amount of time that I have before a deadline, while underestimating how much work I will want to put into a given project.  Part of me also imagines that the asteroid will surely hit between now and whatever deadline so it won’t actually matter.

I can manage the consequences of my over- and underestimating (as well as the general paucity of asteroids) fairly well under normal circumstances.  But when shit inevitably happens, that mismatch becomes acutely untenable.

So: try to plan out your projects and commitments, as best as you are able, so that they align with how busy you want to be, and when, while also maintaining an overall mode of existence that is tolerable.  (Parenthetically, I think academics ought to aspire to existences that are more than tolerable, and break the habit of postponing tolerability until the summer.)  Not all of this is in your control, of course, so another part of writing and working well is, I think, accepting that those plans won’t always pan out.  And leave a margin for catastrophes, great and small.  If your whole writing scheme is contingent on you never getting a flat tire / your kid never getting sick / you never getting called for jury duty / no one you love ever needing you or dying, it probably isn’t going to work for you long-term.

Consider what it’s worth to you.
Because we are all, alas, constrained by the laws of time and space, doing one thing generally means not doing another (or half-doing two things at once).  Try to be cognizant of the trade-offs your writing affords and requires of you.  Be honest about whether the potential rewards actually appeal to you, and your values.  And then consider the costs, and whether they’re acceptable.  With a few exceptions, I am generally fine to sacrifice binge-watching for writing.  And sometimes I feel very okay opting out of being social so I can stay in and work.  But on the other hand, it’s almost never worth it to me – though it used to be – to trade work for sleep, or healthy food, or exercise.  Maybe your non-negotiable stuff is different.  The point is to figure out what that non-negotiable stuff is, and protect it … otherwise work will eat it all.

Detach from the outcome.
Beyond doing your best to make your ideas intelligible and your style engaging, you can’t control how people will respond to your writing.  Consider your audience, but don’t obsess about them, and learn the difference between wanting to connect with your readers and needing to charm and trap them into your ways of seeing and thinking.  Efforts to engineer reader reactions almost never generate better writing, and are much more likely to result in arguments that overreach or resort to pedantry, while the fixation on impressing your audiences will ultimately leave you stultified and unable to say much of anything at all.  Good ideas are much easier to come by than magic words.

Look, and move, forward. 
You will have seasons when you are more productive, seasons when you are less productive, and seasons when you are scarcely functional.  Hopefully, over the course of your writing life, these will balance out into an overall sense of accomplishment, with a body of work that bears it out.  When you are more productive, spend some time figuring out what enables you to work at that level, but don’t make yourself crazy trying to recreate it every time you encounter a slump.  Chances are, it’s mostly a matter of circumstance: a legitimate manifestation of your brilliance, sure, but maybe also just good luck.  Conversely, the seasons when you are less productive are also likely to be those in which your luck is worse than usual, and not a final revelation of your incompetence.

Capitalism tells us that time is modular, that any hour has potentially the same value as any other hour, and hence that missed hours can be replaced.  Nope.  If there is something big that keeps you from your work for a season, you won’t (sorry) be able to get those hours back.  And especially if that something big is also something massively unpleasant, you probably won’t be able to stop feeling lousy about those lost hours, anxious or mournful about the work you could be doing, and resentful of the people around you who happen to be enjoying one of those good-luck seasons of magical writing.  In those moments, all you can do is muddle through: do what you can with your radically reduced resources, plead for deadline clemency if you need it, and accept – your overwhelming fatigue may help lubricate this process – that you probably won’t be producing your very best work at this particular godawful juncture.  And don’t compound the insult by blaming yourself for those lost hours, those words left unwritten.  For my part, now that I’m halfway (give or take) back in the saddle after a pretty unrelentingly miserable eighteen months, it’s a daily struggle not to take the losses of that period out on myself.  It takes a lot of mental discipline to focus on what you can do, not on what you didn’t because you couldn’t.

*    *    *    *    *

So that’s a little bit of what I know now that I didn’t know before.  It strikes me as odd that academics, generally so good at questioning why things are the way they are, rarely bring their skeptical sensibilities to the task of questioning their own work habits or the expectations they have internalized.  And for those who are satisfied with their circumstances, there may be no need for this kind of querying.  But I get the impression (or maybe I just run with an exceptionally grumpy crowd) that lots of us are less than satisfied.  Of course, many of the reasons for that are structural, and so insuperable by these tiny little hacks.  But despite this, or maybe because of it, minor adjustments made in the service of your own comfort are meaningful, worth it, and necessary.


i heart snow days (even on sabbatical).

… just waiting for it to start snowing.  Already, UMBC has cancelled its first day of classes; no matter for me, really, as I am on sabbatical, and already every day feels a little bit like a snow day: lucky, open, forgiving, tinged in the evenings with that little bit of dread at the thought of a return to normal.  But it’s not technically a sabbatical until everyone else has to show up for things that I don’t, so I guess that means Monday for me will be a combination sabbatical snow day, which still feels like something even if the fact of the sabbatical diminishes the reward of the snow day.

Not long ago I was chatting with a colleague who said he found snow days incredibly frustrating, that he resented their interruption of his classes and the rhythm of the semester.  If his students have assignments due on a day when classes are cancelled, he still expects them to submit the work electronically and on time.  For future snow days, he says, he is thinking about experimenting with ways to convene class online.  I admired his dedication, but something about the conversation made me sad.

Sad in the same way that I feel sad when I hear academics say they never take days off or humblebrag about their 80-hour workweeks, the same way I feel when I get emails timestamped from the very wee hours or on weekend evenings.  In those instances, it’s a sadness overlaid with writing guilt, which spills into annoyance, which gets tangled with compassion (which I suppose allows me to offset the writing guilt by feeling superior) for them, for their families, their pets, their bodies, their friends.  It’s a rich text.

Anyway.

The snow day is an assertion that there is something bigger than my priorities, my ego, my expectations.  To be an adult is to be reminded, daily, that the world is almost entirely unconcerned with these things, and most of those reminders are unpleasant at a minimum.  But the snow day delivers a reminder in a different, gentler form: it takes the shape of a reprieve (of course, I have the supreme luxuries of a job that will pay me anyway, and a house that will keep me warm, so I can simply take it).

None of this is to say I don’t work on snow days.  I do, but less than usual and with a sense of relief, gratitude even.  And sometimes that little perceptual shift is enough to make the day feel really different.  If possible, I try to reapportion snow days for my own work, reading and thinking and writing.

But beyond the pleasure of a break from grading and class prep, I think there is a pedagogical value for students in snow days as well.  Doubtless, administrators would prefer that we find some creative way to teach despite the circumstances, but beyond communicating with my students about how we’ll adapt to the change in plans, I don’t.  I let them have their snow day.

More than anything, I want to cultivate curiosity in my students, and curiosity requires an openness to the world, a divestment of their expectations, a relinquishment of their position at the center of the universe, an awareness and a willingness to be surprised, and sometimes even derailed.  Snow days, I think, reinforce these lessons.

Surely, the routinization of crisis under neoliberalism can have a similar effect, and some of my students probably don’t need much education about precarity, because they live it.  All the more reason, then, to give them the day off.


a holiday from expectations

The last picture I took in 2015.

For some reason, even though my hatred of the holidays in general finds its most visceral expression in my hatred of New Year’s Eve, this year I am experiencing the new year as something.  Maybe it’s the relief, admittedly arbitrary, of having an extra shitty 2015 in the rearview.  Maybe it’s my list of resolutions, which so far seem easy to keep because I haven’t had to leave the house much since the year began, and their potential to orient me toward an existence veritably sparkling with peacefulness and productivity.  I don’t know.  Circumstantially, nothing much has changed with the turn of this new calendar page: my list of problems and deadlines hasn’t gotten any shorter, nor have my stores of patience or aptitude increased.  But still, it feels like something.

There are lots of social, cultural, and economic reasons why people dislike New Year’s Eve (the internet tells me), and they all make sense: pressure, generalized FOMO, and the cost or difficulty of realizing most of our desires for the evening.  And these are plausible explanations for the drinking and the fireworks.

For my part, I’ve been thinking a lot about gifts during this go-round of the holidays, the intricacies of exchange, presentation, and expectation that accompany them.  Lots of people (again, according to the internet) experience these rituals as burdensome, both socially and financially.  So do I.  Surely they persist because of massive, globalized, and finely-tuned apparatuses of production and consumption.  But the season that extends from Thanksgiving to just after Christmas is, symbolically at least, perhaps the most orderly part of the year.   The instructions and expectations, impossible though they might be, are crystalline.  And in most instances, the shortest and most direct route to fulfilling them is consumption.  All of us who observe the Christmas holiday know what we ought to be doing, which is comforting, albeit in a way that’s prickly and anxiety-provoking.

Marking the new year is different.  The only real uniformity and guidance comes from the glittery synchronicity of televised celebrations.  And as a holiday, it’s unusual in being untethered from rituals of gift-giving.  It’s cheaper that way, but also, I suspect, discomfiting.  The arrival of the new year is an empty signifier.  Hence, I think, the drinking and fireworks.  Insubstantial observances for a meaningless transition.

But still, this year, it feels like something.  I’m finding pleasure, of a sort, in the emptiness of the signifier.  It feels, in this brief stretch where so much seems to be on pause and very little is mandated, like a relief.