On Emotions and Overthinking in Academia

@angry_prof | 25/10/16


I distinctly remember having one particularly confusing week in grad school in 2001. I was funded, published, and on track to complete my dissertation by age 27. But for some reason, that was the week I chose to lie extensively to my university, advisor, and family about having meningitis and spent the entire week on my sofa bed watching Maury Povich. No, this wasn’t the gut-punch anxiety of intentionally emailing the wrong attachment because my comprehensive exam was not finished on time, or the total emotional collapse after my significant other moved away. This didn’t make sense.

As a professor and professional overthinker, I’ve grown accustomed to confused looks when I explain a train of thought or how I make decisions; disquieting looks of incredulity mixed with sadness and a regrettable inability to empathize. Faces both impressed by the sheer volume of overlaid cognition and clearly appreciative of not having to live inside of it. And I’m fully aware that I produce similarly conflicted microexpressions when I hear “I love what I do,” reflecting both a disdain for flowery emotional language and a deep-seated envy of being able to suspend disbelief about the academic system long enough to develop feelings for it.

So I suppose it’s really not that surprising that there exist remarkably few people with the intestinal fortitude to tolerate my apparent inability to bask in the projected Hunger Games glory of tenure, persistent use of exile as a metaphor for sabbatical, and rehearsed disillusionment with academia as a dystopian, publisher-owned, ego-fuelled Matrix. I get that I’m not the most optimistic person, and that I should presumably have already gotten used to the interpersonal disconnect and ambivalent isolation afforded by an academia-trained propensity for overthought.

But maybe it’s FOMOOE – fear of missing out on overthinking everything – that kills the idea of optimism before it infects. Or maybe it’s my life-long membership in the cult of the next, that ever-lengthening pursuit of the perfect title, institution, journal, award, or mention by one’s academic hero – that pinhole of guiding light that will one day transform into a glorious beacon announcing one’s prophetic insight, intellectual ferocity, or near-death pursuit of knowledge to the world. That imagined validating end point making all the nights, compromises, and forgone personal life experiences worthwhile.

Or maybe it’s just me. Maybe it’s that academics like me tend to self-select into this heady ego system, tolerating a culture of intellectual prize-fighting at the expense of overworking the eager in order for those occasional strokes of ego to feel that much more self-soothing. That heart flutter of excitement when opening a conference notification email. That profound swelling of pride when seeing your name and affiliation formatted in columns in your publication PDF. That feeling of royalty when stepping off a plane in a foreign land to address an adoringly naive, intellectually starved audience satisfied only by the acute physical apperception of soul-quenching speculation leaving your lips one syllable at a time.

I don’t know. Sometimes I think my experience in academia would be easier if I could better ignore how the intellectual stimulation of discovery or pride of publication doesn’t quite mask the loneliness of being the only one who understands what you do at your institution, or drinking alone at a hotel bar because everyone else at the conference was meeting up with old colleagues. I sometimes wonder if imposter syndrome is specific enough a label to cover feeling out of place not because of skills or reputation, but because of having too many feelings or thinking too much about them. I also often wonder if my colleagues are really my friends, or if we’re just the only ones consistently left behind as students continually move on to more interesting developmental milestones and career challenges.

But what tends to bug me the most is that I can’t decide whether I think too much, feel too much, or both; whether I’m overthinking my feelings, or getting too emotional about the way I think. And then there’s trying to figure out if all this thinking and feeling is typical, if I am alone in wondering why all of this seems so confusing. Whether spending a week in bed means I’ve developed a remarkably sophisticated premature disillusionment with the publishing oligarchy dominating academic politics, or if I might just be depressed because I’m alone, as a normal person would be. It’s a confusing process trying to decide if being a good academic means harnessing all emotions toward the good of science, or alternatively, if having feelings that get in the way of writing means I’ve chosen the wrong profession.

The hypothesis that this extent of deliberation over my emotions makes me special is not supported by immediate responses to sarcastic attention flares on Twitter. It is readily debunked by body language from colleagues that very clearly tells me to stop talking because I’m making everyone uncomfortable. It’s not easy bringing up feeling confused, disillusioned, sad, lonely, or depressed in academic circles without worrying about how it will impact departmental politics or your professional reputation. And I’m not saying I’m particularly adept at expressing these sentiments or admitting when I need help, but I have learned a few things since grad school.

First, I am not alone. I have learned to recognize a familiar pain in the eyes of students, post-docs, and fellow faculty when I talk about the struggle to maintain self-care or personal relationships in the face of teaching demands or the pressure to always be writing. I now notice the quiet nods from colleagues when intimating through a change in tone or well-timed silence how truly lonely it can be to live inside your head for a living. And just as I’ve tried to create a safe space for students to yell or cry over illness, disability, loss, discrimination, finances, family, or even a manuscript rejection, I’ve also seen full professors completely break down when things were too much.

Second, saying these things out loud takes practice. Yes, it does feel exceptionally weird and like an explicit admission of weakness or collective betrayal to admit doubting yourself, regretting academic career decisions, or acknowledging that your love for what you do may not be strong enough to compensate for its emotional toll. But there are few things like hearing yourself say the words “I don’t enjoy this any more” or “I think I’m just really lonely” out loud to kickstart your academic propensity to problem solve or to stumble across someone you actually believe when they say “I hear you” or “it will be ok”.

Finally, I’ve learned that although I may as an academic be able to convince myself that my emotions are too complicated or specialized for colleagues, friends, family, or the general public to appreciate, this is complete bullshit. Arguably the most reliable consequence of assuming that my feelings were not understandable by others because they concerned impact factors, letters to editors, intradisciplinary norms, training doctoral students, or teaching/evaluating higher-order cognition was that I was left feeling even more alone than before.

In my experience, academics are not a special breed immune to basic emotions, but instead uniquely equipped to paint ourselves into a corner of isolation by convincing ourselves that our experiences are qualitatively unique as evidenced by others not understanding what we say or do. Feeling embarrassed of not being able to keep a promise to yourself is not unique. Feeling shame when facing unmistakable consequences of choosing your career over your family does not make you special. Wondering if you’ll ever achieve a level of success where you won’t feel like an imposter is so common they’ve had a label for it since, like, the ’70s.

If admitting you have these feelings is the first step to feeling less alone, the next step is probably swallowing your pride and putting it as simply as possible. Although perhaps not as metacognitively satisfying as “mitigating affective disengagement by way of linguistic transduction and affiliation”, being honest about how you feel might require the humbling realization that although your work might set you apart, your feelings don’t. Whether starting with sarcastic quips on Twitter or a trip to your friendly neighborhood psychologist, there are people who listen if you try to say something.

In an academic world where cognition is currency and publication is king, I understand the academic disinterest toward emotions not involving passion, inspiration, or perseverance that can distract from writing and contributing to science. I’m just saying that pursuing your academic dreams can lead to treating your emotions like an afterthought, and that as overthinkers, we can probably do better.


On #PeerRevWk16: An Entirely Cynical Perspective

N. C. Hall  /  12/10/2016


#PeerRevWk16 marks this year’s edition of an annual effort by academic publishers to bolster flagging peer review participation, quality, and speed through explicit statements of thanks and recognition.

Although this initiative could be viewed as a face-valid effort by a public service industry charged by governments and post-secondary institutions with the sacred, inestimable responsibility of research dissemination, there are serious systemic problems underlying academics’ reluctance to review that this hashtag effort does little to address: huge publisher profits afforded by gouging public institutions while not meaningfully compensating academics, unjustifiably high open-access fees, and peer review patents used to stifle competition.

Basically, I started to feel uncomfortable seeing publishers attempt to dominate a hashtag ostensibly “for” academics with tweets containing marketing-department infographics on what academics want, promoting a new reviewer ratings system, or sharing “how-to” guides to cost-effectively improve the quality/speed of free academic labour. In response, it seemed important to balance this profitable status quo narrative by highlighting the uncomfortable realities of the peer review process for academics. I am by no means an expert on higher education policy/ethics/economics; I just wanted to share information and balance the discussion about how to promote research quality by better supporting those who do it.

It all started a few weeks ago when I first noticed tweets from academic publishers pop up in my timeline underscoring the importance and novelty of thanking peer reviewers as well as quantifying/ranking peer review efforts:

In typical fashion, I responded with flippant sarcastic commentary, thinking it to be an obviously transparent (and hopefully temporary) publisher effort to pacify volunteer reviewers with a pat on the back and self-relevant data:

But this weird gratitude parade only seemed to be ramping up, and it got me thinking more seriously about the motivation behind these reviewer appreciation efforts:

As a good academic, I supplemented these devastating hot takes with references to external sources outlining the growing dissent concerning the publication process:

With such an eviscerating response to this uncomfortable wave of public publisher affection, I thought my job was done. However, I soon realized there was an actual hashtag for this initiative – #PeerRevWk16 – and an entire week to come of publisher efforts to spam Twitter with pre-scheduled, strategic gratitude PR aimed at thanking academics by educating them as to their peer review value and responsibilities.

Some #PeerRevWk16 publisher tweets hoped to inform researchers of the importance of peer reviews as the cornerstone of scientific inquiry, as if they were somehow not addressing individuals who by definition should not only be intimately familiar with the scientific process but also have based their research careers largely on this premise:

Other tweets expressed heartfelt thanks to reviewers for their time and effort through mass cut-and-paste “publishers are people too” gestures garnering remarkably few RTs or replies:

Publisher spam also included regularly scheduled marketing-office infographic blasts educating academics about why they do (read “should do”) peer reviews, with most results ironically showing academics to have already decided on better ways to spend their time:

And then there were the tweets consistently promoting the new reviewer recognition system “Publons,” a publisher-owned effort to bolster peer reviewer commitment by tracking, quantifying, and ranking peer reviewers:

But perhaps the most condescending #PeerRevWk16 tweets were those gently informing academics as to how they could better perform their free publisher labour:

So I admittedly got a bit snarky:

And being on sabbatical, I soon diverted words from manuscript revisions to countering this increasingly awkward, oblivious, and patronizing publisher narrative implying that problematic peer review disengagement could be remedied not by meaningful compensation or real talk about peer review costs, but by a Twitter campaign aimed at educating, flattering, and shaming academics. Again, I’m not an expert on the academic publishing industry, but it seemed important to share some thoughts on issues that were clearly being avoided, such as:

1.  The peer review burden on vulnerable academics:

2.  The ethics of peer review compensation:

3.  In-store credit as review compensation:

4.  Financial compensation for peer reviews:

5.  The exclusion of industry expertise:

6.  Peer review sampling bias:

7.  The “gamification” of peer review:

8.  My personal review perspective:

9.  Public perception of publisher appreciation efforts:

So while the #PeerRevWk16 initiative does on the surface present as an effort to simply thank and support peer reviewers, a quick consideration of the academic publishing landscape suggests that it may also represent an effort to whitewash growing public discontent over a massively profitable industry that does shamefully little to show respect for the free academic labour on which it relies:

So for good measure, I doubled down with @AcademicsSay to better punctuate the #PeerRevWk16 publisher noise:

Even Nature got in on the fun:

And despite publisher-provided highlight reels of #PeerRevWk16 in which most of the above is effectively excluded, the narrative that resonated most with academics was obvious:

As to where to go from here, there were a few thoughts:

Maybe it’s just me, but this hashtag effort at best seems intended to distract from publisher problems or promote new publisher products. At worst, it seems a fundamentally misguided attempt to sustain profits by increasing peer review engagement among (a) inexperienced, less expert academics not yet familiar with the scientific process, (b) early career researchers trying any way they can to demonstrate a willingness to sacrifice their time and energy to potential employers, or (c) already overburdened academics disillusioned with the publication process who need and will take the self-esteem boost despite its patronizing tone.

Is a thank you from publishers for peer reviewing appreciated? Perhaps, but that’s not why we do it. And as a transparent attempt to placate a base increasingly dissatisfied with publishers profiting from their good will, institutional/disciplinary pressure, and passion for science, the #PeerRevWk16 effort kinda looks like using the “tip” section of a bill to provide actual tips on how to serve publishers better:

Of course, I might be entirely off-base in interpreting #PeerRevWk16 as anything other than a face-valid attempt to show some much-needed appreciation to hardworking volunteers. But as a leading authority on pandering to academics on Twitter, I can safely say that academic publisher trolling could use some work.


What I Learned About Writing by Not

Author: Rebecca A. Adelman
Original: www.rebeccaaadelman.com


All is not lost.  What I have lacked in tangible productivity over my long season of writer’s block (which seems finally to be limping its way to a close), I have gained in new understandings of the intricacies of my writing process and the fussy mechanics of getting words on the page.

When you aren’t getting words on the page, it’s crazy annoying (at best) to hear about people that are.  And it’s similarly unpleasant to receive unsolicited suggestions about how to get yourself unstuck.  As if it was simply a matter of will or ergonomics or mental hygiene.  But if it was that easy, anyone could do it.  Producing good work, and doing it well, takes more than that.  So here are a few things I figured out about being productive when I was struggling to produce anything at all.  It’s an open letter, of sorts, to my writerly self – the “I” is me, and so is the “you.”  But the “you” can also be, you know, you, if you are reading this and wanting to reconsider your writing praxis.

Become attuned to your limits.
It’s hard to tune out the constant drone of academic meta-commentary about how much (or, from the occasional maverick, how little) we work.  And it helps to know that most of those aggrandizing self-reports are bullshit.  But even still, focusing too much on what other people are doing, or not, just leaves me insecure, or anxious, or envious.  So spend less time worrying about what other people are doing and focus on your own patterns. Then figure out how you work, and be honest about whether all the hours you spend “working” are actually that.  For example, I’ve figured out that I’m neither efficient nor terribly lucid after dinner, and that even when I go back to work late in the evening, I’m not getting much done besides maybe assuaging my guilt about not working enough.

Diminishing returns are a thing.  So consider whether you might be better served by reinvesting those mediocre or largely symbolic work hours elsewhere.

Figure out how you want the experience of writing to feel.  
Turns out, there are no extra points for suffering.  Or if there are, they circulate in an economy that is wildly unrewarding.  Like the counters where you redeem your tickets at arcades: a small fortune in tokens and hours spent playing Skeeball leave you with an armload of little cardboard rectangles, and the teenager in charge of the whole operation barely acknowledges you when you come to select your prize, and it ends up that all you can afford is a pencil case.  Anyway.

Few of us have the luxury, presumably, to only write when it feels good.  Deadlines, tenure, promotion, &c.  But unless you produce your best work in the throes of abject misery, experiment with the novel practice of setting your writing aside when writing feels terrible.  We all have different thresholds for ‘terrible,’ and that terrible feeling might be mental or physical, but when you encounter that threshold, I think it’s smart to heed it. Admittedly, I am still relatively new to the routine of being a peer-reviewer, but I have not yet encountered a reviewer questionnaire instructing me to give special consideration to a project if I think the author cried a lot (A LOT) while they composed it.  And if there are people who will give you extra credit for your anguish, think carefully about whether you want to play by that set of rules.

Spend some time thinking about how it feels when you are doing your best work.  Maybe you feel focused, or excited, or peaceful, or maybe you’re so in it that you don’t feel anything at all.  Take advantage of those times, figure out how to increase their frequency if possible, develop strategies for doing good-enough work in circumstances that only approximate them.  And otherwise: leave it alone.

Work at a pace that’s sustainable.
Pretty much every academic I know, including me, is overcommitted.  There are lots of reasons for this, both individual and structural.  Obviously, everybody will define “overcommitted” in their own ways, and experience being overcommitted idiosyncratically.  I’ll need to figure out, eventually, why I have a tendency to hoard projects, but here’s what I know for now: I tend to overestimate the amount of time that I have before a deadline, while underestimating how much work I will want to put into a given project.  Part of me also imagines that the asteroid will surely hit between now and whatever deadline so it won’t actually matter.

I can manage the consequences of my over- and underestimating (as well as the general paucity of asteroids) fairly well under normal circumstances.  But when shit inevitably happens, that mismatch becomes acutely untenable.

So: try to plan out your projects and commitments, as best as you are able, so that they align with how busy you want to be, and when, while also maintaining an overall mode of existence that is tolerable.  (Parenthetically, I think academics ought to aspire to existences that are more than tolerable, and break the habit of postponing tolerability until the summer.)  Not all of this is in your control, of course, so another part of writing and working well is, I think, accepting that those plans won’t always pan out.  And leave a margin for catastrophes, great and small.  If your whole writing scheme is contingent on you never getting a flat tire / your kid never getting sick / you never getting called for jury duty / no one you love ever needing you or dying, it probably isn’t going to work for you long-term.

Consider what it’s worth to you.
Because we are all, alas, constrained by the laws of time and space, doing one thing generally means not doing another (or half-doing two things at once).  Try to be cognizant of the trade-offs your writing affords and requires of you.  Be honest about whether the potential rewards actually appeal to you, and your values.  And then consider the costs, and whether they’re acceptable.  With a few exceptions, I am generally fine to sacrifice binge-watching for writing.  And sometimes I feel very okay opting out of being social so I can stay in and work.  But on the other hand, it’s almost never worth it to me – though it used to be – to trade work for sleep, or healthy food, or exercise.  Maybe your non-negotiable stuff is different.  The point is to figure out what that non-negotiable stuff is, and protect it … otherwise work will eat it all.

Detach from the outcome.
Beyond doing your best to make your ideas intelligible and your style engaging, you can’t control how people will respond to your writing.  Consider your audience, but don’t obsess about them, and learn the difference between wanting to connect with your readers and needing to charm and trap them into your ways of seeing and thinking.  Efforts to engineer reader reactions almost never generate better writing, and are much more likely to result in arguments that overreach or resort to pedantry, while the fixation with impressing your audiences will ultimately leave you stultified and unable to say much of anything at all.  Good ideas are much easier to come by than magic words.

Look, and move, forward. 
You will have seasons when you are more productive, seasons when you are less productive, and seasons when you are scarcely functional.  Hopefully, over the course of your writing life, these will balance out into an overall sense of accomplishment, with a body of work that bears it out.  When you are more productive, spend some time figuring out what enables you to work at that level, but don’t make yourself crazy trying to recreate it every time you encounter a slump.  Chances are, it’s mostly a matter of circumstance: a legitimate manifestation of your brilliance, sure, but maybe also just good luck.  Conversely, the seasons when you are less productive are also likely to be those in which your luck is worse than usual, and not a final revelation of your incompetence.

Capitalism tells us that time is modular, that any hour has potentially the same value as any other hour, and hence that missed hours can be replaced.  Nope.  If there is something big that keeps you from your work for a season, you won’t (sorry) be able to get those hours back.  And especially if that something big is also something massively unpleasant, you probably won’t be able to stop feeling lousy about those lost hours, anxious or mournful about the work you could be doing, and resentful of the people around you who happen to be enjoying one of those good-luck seasons of magical writing.  In those moments, all you can do is muddle through: do what you can with your radically reduced resources, plead for deadline clemency if you need it, and accept – your overwhelming fatigue may help lubricate this process – that you probably won’t be producing your very best work at this particular godawful juncture.  And don’t compound the insult by blaming yourself for those lost hours, those words left unwritten.  For my part, now that I’m halfway (give or take) back in the saddle after a pretty unrelentingly miserable eighteen months, it’s a daily struggle not to take the losses of that period out on myself.  It takes a lot of mental discipline to focus on what you can do, not on what you didn’t because you couldn’t.

*    *    *    *    *

So that’s a little bit of what I know now that I didn’t know before.  It strikes me as odd that academics, generally so good at questioning why things are the way they are, rarely bring their skeptical sensibilities to the task of questioning their own work habits or the expectations they have internalized.  And for those who are satisfied with their circumstances, there may be no need for this kind of querying.  But I get the impression (or maybe I just run with an exceptionally grumpy crowd) that lots of us are less than satisfied.  Of course, many of the reasons for that are structural, and so insuperable by these tiny little hacks.  But despite this, or maybe because of it, minor adjustments made in the service of your own comfort are meaningful, worth it, and necessary.


Unpacking @AcademicsSay: Part 1

N. C. Hall  /  06/05/2016


This is my first blog post.

And the only reason you’re seeing it is @AcademicsSay, ostensibly one of the most influential academic social media accounts reaching upwards of 24 million views a month across platforms.

Although polite company warrants eyes-down, humblebrag explanations of the success of this social experiment as serendipitous, that’s not entirely accurate. Instead, the account growth has been markedly consistent, largely anticipated, and intentionally facilitated by strategies common to influential accounts.

To the extent the following may read as a self-indulgent, overthinking, faux-Machiavellian hyper-justification of writing procrastination, I apologize in advance. Below is Part 1 of a tl;dr overview of the varied growth hacking strategies derived mainly from observation, basic psychology, and trial-and-error that may or may not have contributed to the success of @AcademicsSay.


1.   Opportunity. When I set up my professional Twitter account in May 2013, there was no common gathering point for faculty or lightning rod for feedback/sharing. There were no clear accounts to follow first, nothing central that really got academics excited. I wanted to create that, first because it’s confusing and boring to go online and not have a place to connect with others. Second, I was feeling burnt out and needed a laugh. There were also no humour accounts for faculty, aside from scattered student-shaming efforts and @PhDComics for grad students, so I made one. I am not a humour writer. But you don’t need to be great when there’s no competition; you just need to show up.

2.   Tone. I am not a generally positive person. So when deciding how to sound online, I went with my regularly scheduled deadpan, sarcastic, depressing, uncomfortably self-aware over-explanations that make for awkward conversation. I also pride myself on avoiding the wrath of colleagues by getting a laugh despite my interrupting their work as a way of procrastinating on mine. So the overall tone of @AcademicsSay was basically an extension of what I was already doing, just in a more distilled online format. I then found a recognizable meme that fit the tone and went from there. Fortunately, as non-intellectual or unintentionally humourous aspects of academic content tend to get the most attention on social media (e.g., the “Gabor” effect), I was immediately in business.

3.   Authority. I regularly get comments, questions, and surprisingly impassioned critiques about the account behavior; hopefully this section addresses some of that. In addition to content tone, I incorporated from the outset a set of implicit cues to convey authority to potential followers and expedite follow/retweet decisions. This was for two reasons: first, to provide an ironic take on the stereotypical aloof, egocentric academic persona; second, to mimic the profiles of existing viral parody accounts in the history or science domains. Some examples involving language, formatting, colour, and ratios are below.

4.   Language. The word “shit” in the account name implies irreverence or catharsis and is unexpected in academic timelines, grabbing attention while providing ironic context for otherwise curse-free content. The account handle remained curse-free to accommodate more respectable manual retweets. Similarly, the account name referred to “academics” rather than “professors” to convey faculty responsibilities beyond instruction (e.g., writing, tenure requirements, work-life balance). As the content was to be more “water-cooler gossip” or internal self-talk than in-class “dad jokes,” the less-than-student-centered approach was intentional.

5.   Formatting. Tweet text was formatted to exclude “all caps,” emoticons, exclamation points, and question marks to mitigate impressions of attention-seeking and uncertainty. In addition to facilitating a deadpan or aloof tone, ending sentences with periods was also a bit of an inside academic joke, not unlike how Kanye West describes the private hilarity of not smiling. To not dissuade engagement among academics who are typically less than familiar with Twitter protocols, I also initially tried to avoid including nonintuitive hashtags (e.g., #ecrchat) and acronyms (e.g., H/T) in favour of more accessible terminology (e.g., via, courtesy of).

6.   Colour. The colour profile was also intentional. Although the specific profile image (“avi”) was selected almost at random from my cell phone, it needed to satisfy two conditions: it had to show well at lower resolutions and needed to be red. The colour red was emphasized based on research showing red to implicitly convey competitive success and dominance in affiliative and advertising contexts (e.g., CNN, Time, Science, Netflix, BuzzFeed, TMZ, TED Talks) and to solicit more online engagement (e.g., link clicks) than other colours. The image itself is simply a cropped photo of a graffiti art gorilla I took on the sidewalk after a disappointing trip to the farmer’s market. I’d like to think the gorilla signified other elements (e.g., stoicism, “300-lb gorilla” metaphor), but it’s mainly just red.

7.   Ratios. The account also manipulated three Twitter ratios to implicitly convey authority. First, an exaggerated “following-to-follower” ratio was achieved by not following other accounts (as per other parody accounts) requiring unidirectional follows vs. reciprocal “followbacks.” Second, the “retweet-to-follower” ratio was bolstered by deleting tweets that did not sufficiently resonate; a ratio consistently held to around 0.001. For example, tweets in Spring of 2014 (~30K followers) that did not reach 30 retweets were omitted (typically within an hour), with the exception of tweets including links or promoting content intended for “clickthroughs” (current cut-off is ~150 retweets <1 hour, >1K likes on FB; see @TheTweetOfGod, @SoVeryBritish for comparable ratios). Third, deleting tweets with insufficient retweets helped to improve the “tweet-to-follower” ratio. Off-brand tweets promoting specific accounts, lists, hashtags, sites, etc. were similarly omitted to provide an on-brand, content-focussed read for timeline scrollers (“grooming”). Overall, these ratios were maximized to create the impression of an authoritative, non-reciprocal, content-provider account where each tweet not only resonated but gained substantial followers.
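To make the cut-off arithmetic concrete, here is a minimal Python sketch of the deletion heuristic described above; the data structure and helper function are hypothetical illustrations of the rule (retweets below ~0.001 of followers after roughly an hour, link tweets judged separately), not the actual tooling behind the account.

```python
# Hypothetical sketch of the retweet-to-follower deletion rule described above.
# The numbers come from the post (~0.001 ratio, ~1 hour window); the Tweet
# structure and should_delete helper are illustrative assumptions only.

from dataclasses import dataclass

RATIO_THRESHOLD = 0.001   # minimum retweets per follower before deletion
WINDOW_MINUTES = 60       # grace period before a tweet is judged

@dataclass
class Tweet:
    retweets: int
    minutes_live: int
    has_link: bool        # link/promo tweets were held to different cut-offs

def should_delete(tweet: Tweet, followers: int) -> bool:
    """Return True if a tweet under-performs the ~0.001 ratio after an hour."""
    if tweet.has_link:                       # judged by clickthroughs instead
        return False
    if tweet.minutes_live < WINDOW_MINUTES:  # still inside the grace period
        return False
    return tweet.retweets < followers * RATIO_THRESHOLD

# With ~30K followers (Spring 2014), the cut-off works out to 30 retweets:
print(should_delete(Tweet(retweets=12, minutes_live=75, has_link=False), 30_000))  # True
print(should_delete(Tweet(retweets=45, minutes_live=75, has_link=False), 30_000))  # False
```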

8.   Branding. Similar to other viral parody accounts, @AcademicsSay does not reply or retweet. Instead, standalone text reposted from other accounts is formatted as per a typical academic quotation (“…” – @source), or (more rarely) as a screenshot image, to visually associate or “rebrand” it with the account name and image. The quotation format is immediately recognizable to academics but differs from typical (less visually appealing) manual retweets in which acronyms and the original account are inserted before tweet content (e.g., RT “@source …”). This form of attribution is generally appreciated by those referenced, avoids “Twitter plagiarism,” and facilitates portability across platforms (e.g., Facebook, Tumblr). However, it can also be seen as particularly distasteful (especially screenshots) as it effectively affords self-promotion and metric gains at the expense of direct engagement with source accounts. Given the markedly ego-involving nature of not following someone on Twitter or Facebook, it’s perhaps not surprising that this strategy has to date been the most negatively received.

9.   Images. One of the most well-known and easily implemented ways of increasing Facebook or Twitter engagement is to just add an image (e.g., by 35%). So after waiting three months to ensure that text-based content was resonating with followers (~7K), relevant images were introduced. At this point, I had decided to use the account to recruit for off-line research and consciously opted to forego whatever old-guard intellectual cachet was attached to exclusively sardonic text in favor of incorporating more accessible, existing visual content that elicited a more visceral response (e.g., May 2014: doubling new followers/day to 450+ by doubling down on comics, graphics, and screenshots). Given a long-standing body of work by academic comic legends (e.g., PhD Comics, XKCD) and creative efforts of emerging webcomic artists (e.g., Errant Science, RedPen/BlackPen, The Upturned Microscope), finding content wasn’t hard and I finally had a chance to indulge my long-time love of cartoons. I eventually introduced original images and memes to capitalize on social media norms, mocked up preview graphics to increase clicks for news articles or blogs (16:9 to prevent awkward Twitter cropping, better Facebook previews), and started embedding square blog logos that are automatically grabbed when a link is shared.

10.   Attribution. Given the emotional and financial investment involved in creating visual content for social media, I eventually started to receive responses from artists requesting that additional source information be included in posts beyond that contained in the image. And after a few requests by original artists (e.g., @MacLtoons, Kemson Cooper), online criticism when attribution was not included (e.g., Paris attack graphic), and an education on attribution and copyright by my friend Jorge Cham (@PhDComics) following an uncomfortable Twitter/email exchange with artist @twisteddoodles, I not only research the origins of posted artwork (e.g., TinEye, Karma Decay, Veracity) but try to provide linkbacks to within-platform accounts or external sites to not deprive artists of potential exposure or income. Although posting images without attribution or linkbacks is more efficient (particularly when source/contact info is embedded), a well-worn strategy for expediting growth (see @HistoryInPics, IFLScience), and not unpermitted in the Twitter TOS (see p. 22, Agence France Presse v. Morel), it is more susceptible to removal on Facebook or Twitter (DMCA takedowns) on copyright grounds and is not a good look for an academic audience uncommonly preoccupied with attribution.

11.   Anonymity. I ran the account anonymously until July 2015 for various reasons. First, I didn’t want my atypical online activities to somehow influence my tenure deliberations. It also helped to maintain a focus on the followers, underscoring the aim of the account to resonate based on shared experiences rather than a self-indulgent showcase of intellectual, writing, or humour abilities. In this way, followers were allowed to perceive their engagement more simply as sharing a laugh or connecting with others by way of satire, as opposed to endorsing the attention-seeking efforts of a specific individual. This decision also helped to circumvent the awkward self-esteem-loaded “followback” expectation otherwise encountered with personal Twitter accounts. In a similar vein, demographic cues involving nationality (e.g., American spelling), gender (typically assumed female), race, rank, or discipline that could unnecessarily complicate or bias content perception and mitigate engagement were avoided. Running the account anonymously also gave me more freedom to make mistakes and experiment in terms of content (e.g., topics, attribution) or growth strategies (e.g., branding, promotion) without risk of direct criticism or reprisal.

Maybe it’s because academics tend to be familiar with blinded research and manuscript reviews that remarkably few people ever asked who I was. Or maybe it’s that social media platforms generally promote engagement over attribution, a point illustrated by Twitter adding the “quote tweet” function in 2015 while at the same time quietly removing the automatic insertion of quotation marks and account mention (used for manual retweets) when copying tweets in the app (making it much easier to plagiarize). Regardless, it was only after my tenure was confirmed, account influence exceeded relevant benchmarks, the cachet of “coming out” could be reliably predicted to bolster off-platform efforts (study recruitment), and these unconventional online activities could be justified in part as a public service to non-social-media users that I wrote the Chronicle piece about the account (as agreed upon one year earlier). However, judging by continued confessions of love for “whoever you are” or “you guys,” and minimal spillover to my personal Twitter account, people generally don’t seem to notice or care who’s running the account.

12.   Efficiency. To promote initial growth, I also pre-prepared tweets that released automatically on apps like Buffer (Facebook pages provide in-platform scheduling) and used free sites like Tweriod to determine optimal tweet times (now largely irrelevant due to international reach). Not unlike other parody or satire accounts, I also regularly repeat content. Although I had previously deleted original tweets to disguise this strategy (some accounts delete tweets wholesale, presumably for the same reason), I now keep them up to gauge growth. I initially felt comfortable repeating only after a 6 month lag (consistent with previous Twitter API restrictions preventing older tweets from being viewed), but now tend to repost within 2-3 months due to a follower base big enough to ensure sufficient sharing from those who would not have seen it, would not remember seeing it, or would not mind seeing it again. Although some repeats are verbatim, others are reformatted or modified (e.g., replacing “book” with “blog” 9 days later) to improve engagement. As for the account meme, the “shit xxx say” format itself affords specific efficiencies, such as a focus on what others say (observation is much easier than inspiration) and basic text (Siri dictation while waiting at Starbucks vs. curated content or creating visuals), as demonstrated by even single-letter posts gaining traction. Finally, one unanticipated consequence of this meme is the extent to which it actually encouraged crowdsourced feedback (replies, mentions, emails) that has to date been highly effective in terms of providing off-platform content, pop culture phrases (e.g., “all of the things”, “Netflix and chill”), timely memes (e.g., Game of Thrones), or even grammatical improvements for repeat posts.
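As a rough illustration of the recycling schedule, the repeat logic amounts to a simple lag filter over an archive of past posts. The Python sketch below is a hypothetical mock-up (hand-rolled archive, no real Twitter or Buffer API) of the 2-3 month rule described above.

```python
# Toy sketch of the content-recycling schedule: repost a tweet only after a
# minimum lag (initially ~6 months, later 2-3 months). The archive and its
# fields are hypothetical; scheduling would be handed off to an app like Buffer.

from datetime import datetime, timedelta

REPOST_LAG = timedelta(days=75)  # roughly the 2-3 month window described above

archive = [
    {"text": "You should be writing.", "last_posted": datetime(2016, 1, 10)},
    {"text": "Reviewer 2 has concerns.", "last_posted": datetime(2016, 4, 20)},
]

def due_for_repost(now: datetime) -> list:
    """Return archived tweets whose lag has elapsed, oldest first."""
    eligible = [t for t in archive if now - t["last_posted"] >= REPOST_LAG]
    return sorted(eligible, key=lambda t: t["last_posted"])

for tweet in due_for_repost(datetime(2016, 5, 6)):
    print(tweet["text"])  # only the January post has aged past the lag
```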


So there you go: a quick introduction to some of the more straightforward strategies adopted a priori or over time to expedite follow decisions and account growth for @AcademicsSay. For more on the roles of analytics, experimentation, and emotions, or more awkward topics such as plagiarism, haters, and monetization, check back for Parts 2 and 3 in the coming days.



How to Not be Boring on Academic Social Media

Author: @TheLitCritGuy
Original: TheLitCritGuy.com


For many academics it may seem that the rise of social media is yet another means of potential procrastination. Yet increasingly, certain academics have turned to social media not just as a way of accessing entertainment or as a tool for networking but as a means of engaging audiences in a brand new way.

Perhaps the most famous is @NeinQuarterly, an anonymous account that blends aphorisms, jokes and an expert level knowledge of German literature and culture to produce a fascinating and hugely popular account. Started by a former professor of German literature, @NeinQuarterly’s unique aphoristic and satirical style now appears in print in German and Dutch newspapers, and last year saw the publication of Nein: A Manifesto, a book collecting his finest material that has been published in multiple languages. On YouTube there is, aside from John and Hank Green’s famous ‘Crash Course,’ PhilosophyTube, a channel started from nothing just a few years ago that now has around 60,000 subscribers following its videos on Masters-level philosophy.

Personally, my own anonymous account started for far less career-minded reasons. Having finished my Master’s degree and with a Twitter account that I didn’t really use, I decided to dedicate it to talking about the thinkers and ideas that had intrigued me during Masters study and provoked me into applying for a PhD. I decided to cover literary theorists and critics who had been only briefly touched upon during my undergraduate degree. After starting the account I was convinced it would be largely ignored, yet after tweeting at a few more widely followed accounts it picked up a surprising number of engaged and highly curious followers. Almost immediately, issues such as a posting schedule, what to talk about, and even the limits of my own knowledge became things that had to be dealt with. With a vocal and supportive group of followers I was forced to be honest about my own limitations and my own inexperience, and allowed myself to discover the liberating freedom of telling followers that I don’t know, that I would love to know more about something (an admission almost unthinkable in the high-pressure environment of PhD research). The pressures of normal life meant that the account often became deeply personal as well as academic, and this seemed only to further the connection between me and the great groups of people who followed the account.

On top of this, anonymity comes with certain benefits that using social media with a name and a face doesn’t carry. From behind the “persona” of TheLitCritGuy my opinions don’t need to be run against what my institution or its managers might deem to be acceptable. Anonymity also allows the freedom for a kind of character to emerge. Behind anonymity, anger at the conditions of higher education for ECRs and students can be expressed more forcefully, and I also get to mash up jokes with theory without worrying colleagues will take me less seriously.

For academics who wish to take to social media and use it in a way beyond networking or sharing cat videos, there is no sure-fire way of doing things, but in the course of my own experiment there are a few things that I’ve found to have worked.

Firstly, have a distinctive voice. Anonymous accounts do not necessarily have a name or a face, but they depend upon having a distinctive perspective to offer. On Twitter, pseudonymous accounts such as @EthicistForHire and @CrankyEthicist immediately offer potential followers, from the name alone, an insight into what the account is like.

Secondly, have a purpose. One of the most successful anonymous accounts on #AcademicTwitter, @AcademicsSay, posts jokes that connect really strongly with academics – jokes about coffee, about being overworked, and the ever-present catchphrase that ‘you should be writing.’ These highly shareable posts keep the account highly focused, with a clear sense of purpose, allowing it to grow to being followed by hundreds of thousands of people.

Thirdly, find your audience. Rather than just post into the void, the best academic accounts use the tools of social media to find an interested audience. Most notably, there are hashtags like #twitterstorians, where historians post and organise their thoughts, allowing an audience who want to engage with historians to find them. I always try and organise my own posting under #TheoryTime, allowing followers to keep up with what I’m talking about and catch up on topics they may have missed.

Fourth, expand. Whilst my own Twitter account was successful, I quickly encountered the limitations of the form. I decided to expand my account into a research blog, as well as using the platform I had built on Twitter to write for new websites, bringing @TheLitCritGuy to a much wider audience.

Finally, connect. Whilst people follow an account or watch a YouTube channel to gain knowledge, using social media allows for academia to become more personally relatable – rather than a hierarchy of a teacher with students, twitter becomes a space of conversation and mutual education. Whilst I try and keep the important details of my life private from my account, a few personal details, personal opinions, and replies to followers makes the account more vibrant, more interesting and much more fun for those following.

It is this that makes anonymous accounts so effective too – outside of the structures, rules and roles of university networking, the anonymous account can become a place where academic researchers get to connect directly with an audience. Impact becomes something more than just a metric as people get to connect with academics beyond the realm of university organised public engagement events. Furthermore, this use of social media allows the public to see what life as an academic can be like, in all of its good and bad points.

Behind the anonymity of a nameless, faceless account I’ve shared some of the struggles of being an early career researcher, news about the state of the wider UK HE environment and the sheer joy of teaching as well as sharing and talking about my own research and intellectual passions. Whilst anonymous accounts bring a certain degree of freedom, there is the pressing awareness that my account won’t necessarily benefit my career within the university system. However, as more academics take to social media, using anonymous accounts allows for a new kind of creative, flexible academic to emerge, more closely linked with the public rather than embedded within the ivory towers of the university system.

I’ve received countless tweets, Facebook messages, and emails from people across the world who, through various pressures, felt they couldn’t pursue their own passion for literature and theory – needing a job, or dealing with their children, they feel they’ve missed out on a swathe of knowledge, and it’s a genuine privilege to answer their questions and learn from them. Whether it be emailing economists about Foucault or letting a nursing student know more about phenomenology, using social media has shown me that beyond the limits of the university classroom, people are curious and searching for new ways to be engaged and to learn. Social media can change how we teach and spread knowledge beyond the limits of the university, and through anonymity academics might well find the freedom to connect with the public like never before.


Could Parental Leave Actually be Good for my Academic Career?

Author: David Kent
Original: University Affairs | The Black Hole


Last autumn, I started my research lab at the University of Cambridge’s Stem Cell Institute, but this coming summer I’m doing something completely different – I’m taking parental leave with my first child. I must admit that at least some inspiration came from my brother, who took a term off with his second child and said it was one of the best decisions he’d ever made.

It’s been a tough journey to get a group leader position – 11 years of intense research-focused time, most of which were spent in a complete black hole of uncertainty with respect to my future career. And now, I won’t be in the lab for 14 weeks – we’ll see how it all works out.

Reaction to my decision amongst non-academic family and friends was pretty much universally positive, but reaction from academic colleagues was highly variable – a substantial number of whom think I’m absolutely crazy to take off so much time within the first year of my research lab’s existence. I wasn’t too surprised by this, having emerged from the North American system where parental leave is much less generous than in Europe. What I didn’t expect were the other reactions …

In November, I was at a national cancer conference and at one of the evening receptions I spoke with a female scientist from another U.K. university about women in science. Over the course of the discussion, I mentioned that my partner and I would be taking advantage of the U.K.’s new “Shared Parental Leave” policy, with my partner taking 8.5 months of leave and me taking 3.5 months. She said she was shocked and surprised that a brand new group leader would take the time off, but also said “good for you.”

The next evening is when things really hit home though. After the conference dinner I was on the dance floor and a complete stranger came up to me and asked, “Are you David Kent?” I assumed she had seen my presentation earlier in the day until she continued, “the David Kent who is taking parental leave as a new group leader? I just wanted to say thank you.” We chatted a little and it was as simple as this: a male group leader taking parental leave was just not that common, especially not a 3.5-month block of time. The scientist from the night before had clearly gone off and told her colleagues, and word had spread.

Here I was being showered with praise for taking 3.5 months off work and feeling pretty good about my decision, until I did a quick comparison to the situation of my partner, also an early career scientist. Not only would she be taking nearly three times the amount of leave, but she’s also been carrying a baby around for eight months whilst undertaking world-class research. Is there a small fan club of approving academics lined up to congratulate her on the brave decision to spend time with her child? Not that I’ve seen.

So, in effect, my taking a short block of parental leave has boosted my profile in the eyes of some academics and her taking a longer block will put her in the challenging position that so many young female academics find themselves in: trying to play catch-up and pretend that children haven’t impacted their careers (many do not acknowledge children on CVs, job applications, etc., for fear of being viewed unfavourably). The science community needs to embrace rather than shun such individuals.

Overall, if universities want more women in science, then the way we handle babies and families needs to change – men need to be as “risky” to hire as women. But change does not come overnight and it does not come easy. As a start, more countries (and institutions) need to have “use it or lose it” policies, such as the one in Quebec – the father is given a block of time that the mother cannot use. Universities and individuals need to fight for this. Countries such as Sweden have seen incredible results from such policies and are amongst the world leaders in having women in senior positions. For science specifically, granting agencies need to behave like the European Research Council with respect to eligibility windows and like EMBO for postdoctoral fellowships – creating small allowances for young parents that make the journey just a little bit easier.

Or perhaps we should just force them all out of science – that seems to be the way things are currently set up and it makes me worry for our future science workforce.


It’s OK to Quit Your PhD

Author: Jennifer Polk
Original: From PhD to Life


Occasionally I’m asked about quitting, particularly “quitting” a PhD program. This happened several times last week, when I was in Vancouver.

Contrary to what you may hear or what your own internal critics tell you, there’s no shame in moving on. I remember a long post on a Versatile PhD forum from “PJ,” an ABD thinking about leaving instead of spending another two years (minimum) to finish their PhD. In response, one commenter wrote, “But the real question is, do you want to be a quitter? Now, not everyone will view that question the same, and I’m sure many will say that equating quitting a PhD program to being a quitter is not valid, but in reality, it is.” No! Thankfully, most other commenters on the thread offered more nuanced and helpful reflections and advice. “Finishing is not just about the destination,” one former tenure-track professor pointed out. “If that’s the only thing you want, then it’s a tough few years ahead.” Indeed.

Before you make the decision to leave, separate your inner critic – who may well be reflecting outer critics in your life – from what you know is right for you. Trust your gut, not your gremlin. In my experience, this is a decision that individuals make and re-make over time. I’ve worked with a few clients who’ve contemplated not finishing their PhD programs. While you figure out what you want, it’s ok to be ambivalent, carrying on the work but distancing yourself psychologically and emotionally from academia. What are your goals? Once you know them, you can determine the correct strategy to move toward them. (With thanks to Harvey P. Weingarten’s recent post.)

The “no one likes a quitter” attitude that exists in graduate school and perhaps in academia writ large isn’t warranted. There is nothing inherently good or bad about completing a PhD. It’s only a good move for you if it is a good move for you. While individuals who depart sans degree will come to their own personal conclusions about their decisions, the wide world rarely cares. It’s instructive that in PJ’s original post, they mentioned that their former undergraduate professors were unanimous in advising them to quit. I’ll let English professor (and graduate advisor) Leonard Cassuto speak for ideal advisors everywhere: “Most of my advisees finish their dissertations and get jobs. I’m proud of them. But some walk away – and of that group I’m just as proud” (Graduate School Mess, p. 121). I feel the same way about my own clients, whatever path they choose to take.

A while back Christine Slocum reflected on her career journey in a Transition Q & A post. She’d completed an MA and then two years of a PhD program, then moved on before achieving ABD status. In her post she explains there were several reasons for her choice, including feeling burnt out, lack of community in her department, and desire to start a family. Pursuing the doctorate no longer meshed with her goals: “After some soul searching, I remembered that the reason I was pursuing sociology in the first place was to better understand the mechanisms of social stratification because I wanted to better understand how to undo it. Four years of graduate study [later,] I felt like I had enough that the next five years would be better spent working for an NGO, nonprofit, or government position getting practical experience in the field.”

Heather Steel made a similar decision when she decided not to continue her PhD in the midst of dissertating. She learned important information about herself during graduate school. “There were parts of my program that I enjoyed very much (classes, having the chance to read and think, teaching, and my colleagues), but in the end,” she realized, “sitting for hours in front of a microfilm reader to write something that few people would actually read was not fulfilling.” Heather learned that she enjoys “research in small doses, not projects that take years to see results.” When I did an informational interview with her during my transition, I learned that she didn’t regret her choices. Her career has continued to progress since then.

When I was in Vancouver, a graduate student in the audience at one of my talks shared his own story: he’d been enrolled in a PhD program years before, then left. But here he was, back doing another doctorate! He was nearly done, and this time around he knew it was the correct path for him. I know several people who’ve done similar things, for a variety of reasons. Fascinating, eh?

If completing your PhD is the right move for you, carry on. Get support and help wherever you can find it, go part-time, or take a break or a leave of absence. Make whatever changes you need to smooth your journey. But if the doctorate no longer makes sense – your goals have changed, or you’ve learned more about yourself over the years – then I’ve got your back (in spirit) in deciding not to continue. You’re not “quitting” or “leaving”; instead, you’re embarking on a new, better-for-you path, taking what you learned and experienced and applying it in a context that’s more suitable to who you are, how you work best, and where you want to go. That’s risky and brave, but it’s also just you standing up for yourself. It took me until after my PhD to do that. Feel free to do as I didn’t.


Scientists Have the Power to Change the Publishing System

Author: David Kent
Original: University Affairs | The Black Hole


Earlier this month I read an article by Julia Belluz that ripped into the scientific publishing system. The saddest, and truest, sentiment of the article can be summed up in the following quotation:

“Taxpayers fund a lot of the science that gets done, academics peer review it for free, and then journals charge users ludicrous sums of money to view the finished product.”

This is certainly not the first attack on the publishing process, nor the first to encourage open-access publishing. In the remainder of her article, Ms. Belluz focuses on the role that governments can play in getting more scientific research freely and instantly available. In sum, she suggests that government funding agencies (e.g., the United States National Institutes of Health or the Canadian Institutes of Health Research) could refuse to give grants to scientists who do not publish in open-access journals.

This is a laudable approach, and indeed it is the one being taken, bit by bit, by funding agencies – the Wellcome Trust in the U.K., for example, has a very robust open-access policy that includes providing grant funding for open-access charges. While this will certainly get more research out sooner and without charge, I believe it misses an important aspect of the power dynamic that plagues the scientific publishing process.

The fact is that journals with high impact factors wield enormous power because they hold the key to scientists’ careers – the field has become so obsessed with metrics that it is insufficient to be a good scientist with good ideas and the ability to perform good research. As things stand now, if you want research grants (and in most cases, this means if you want a job), then you need to publish a paper (or several!) with a big-name journal.

So what can scientists do? Well, it turns out scientists are involved in just about every aspect of the publishing power dynamic. First, one needs to understand what’s at stake. Scientists want big-name papers for three main reasons:

  1. Grants
  2. Jobs
  3. Recognition

However, papers in big-name journals do not directly give you grants or jobs, nor are they the only way to be recognized as a good scientist. Other scientists make these decisions, but far too often their judgment is impacted by the glitz and glam of the big-name journals.

Jobs are often won by those doing research that has good institutional fit – they bring a novel technology, a new way of looking at things, or a broad network of excellent former colleagues – but jobs are often lost because the candidate is “not fundable.” The latter is more often than not decided based on where they have published and how a grants panel will view them. So it basically comes down to who can get grants. And who generally decides funding outcomes? Scientists.

I wonder how many grant panels have heard the phrase “the project looks good, but the candidate has only ever published in mid-range journals.” Indeed, I know several scientists who rank applications based on a candidate’s publication record irrespective of how good or bad the project is or how well-resourced the working environment is.

One suggestion: Ban the CV from the grant review process. Rank projects based on the ideas and the ability to carry out the research, rather than on whether someone has published in Nature, Cell or Science. This could in turn remove the pressure to publish in big journals. I’ve often wondered how much of this could actually be boiled down to sheer laziness on the part of scientists perusing the literature and reviewing grants – “Which journals should I scan for recent papers? Just the big ones, surely…” or “This candidate has published in Nature already, they’ll probably do it again, no need to read the proposal too closely.”

Of course I generalize, and there are many crusaders out there (Michael Eisen, Randy Schekman, Fiona Watt, etc.) pushing to change things; I mean them no offence. I just wish that more people could feel safe enough to follow their lead. In my own journey to start up a lab, I am under enormous pressure to publish in a big journal (i.e., my open-access PLoS Biology paper doesn’t make the grade, and the open-access juggernaut eLife has yet to achieve high-level status despite its many philosophical backers).

So, in sum, scientists in positions of power (peer reviewers, institute directors, funding panel chairs) are the real targets for change. Assess based on research merit, not journal label. Let’s make journals tools of communication, not power brokers of scientific careers.


Who Do You Think You Are – Galen Strawson and Life Online

Author: @TheLitCritGuy
Original: TheLitCritGuy.com


One of the most often repeated complaints and criticisms of literary theory is that it lapses frequently into obscurantism and obfuscation. Whilst this charge is deeply unfair and inaccurate, it has to be acknowledged that there is a great deal of theory that is often difficult to apply to the realities of modern life. The effort of applying the abstract and removed language of the academy to the mundane details of existence is a hermeneutical exercise that we don’t always have the time or the energy to perform.

This doesn’t mean that theory is irrelevant: how we construct and understand our lives are questions that theoretical writing directly concerns itself with – issues of identity, consciousness and perception are all areas that theorists have sought to understand. These complex issues are further problematized when one examines the shift in how the self finds cultural and social expression. It used to be that the predominant mode in which this occurred was face to face. We understood ourselves in the context of relationships, be they professional, familial or social. With the rise of technology and the now ubiquitous ‘social media’, that web of relationships has shifted online.

We have friends.

We have followers.

We get likes, RTs and reblogs.

Essentially, things have changed. Before I go any further: this isn’t a plea for a return to a more idealistic, less technology-driven social experience. Both modes of existence share the same prevailing ideological model of how the individual understands themselves. We, speaking generally here, make sense of ourselves by constructing a narrative – one of the things social media has done is make this process more obvious. One only has to look at Facebook timelines to see the explicit construction of subjectivity: a life presented as a coherent narrative, designed to make us look our very best.

To quote Dan Dennett:

‘We are all virtuoso novelists… We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character… of that autobiography is one’s self.’

Contained within this quote are two interrelated theses, which the great analytic philosopher and theorist Galen Strawson identified as the ‘Psychological Narrative Thesis’ and the ‘Ethical Narrative Thesis’.

Let me explain – the Psychological Narrative Thesis is a descriptive, empirical argument about how we see the world: a way of understanding life that is integral to human nature. The Ethical Narrative Thesis is an argument coupled to the first, positing that having or conceiving of one’s life in a narrative sense is necessary or essential for developing true or full personhood.

Now, one can think that these two interrelated ideas are some combination of true or false, but it’s worth examining how these two lines of argument operate online. The desire for narrative reflects our desire for coherence – we want desperately for the things we encounter online to make sense, to cohere in some way, so it should come as no surprise that this is how we treat others online.

The majority of the time this isn’t really an issue, and one of the upsides of online culture is that it tends to treat people as whole and cohesive individuals. Basically, viewing people through the lens of a Narrative works out quite well most of the time – it allows us to make quick and generally fairly reliable judgements about the other, and to present ourselves in such a way that we can be easily comprehended too.

However, there is an issue here – the narrative thesis is a totalising one, a structuralist way of viewing the world and each other. The vast majority of the time it may be sufficient to view ourselves online as a seamless, cohesive whole that tells a singular narrative story, but this quickly runs into a problem: diachronic consistency.

To explain that in less technical-sounding words: the idea that a recognizable thread of consciousness persists through time within one individual just doesn’t hold up. It is not the disconnection within online life that irks, but the flawed drive for all of this to make sense, for all of our lives to be tied together in one neat package. We become authors who edit on the fly, making ourselves the neatest and tidiest selves we can be, desperate to excise the disparate, the different and the dysfunctional.

This isn’t a new problem – to quote the great Virginia Woolf:

Look within and life, it seems, is very far from being “like this”. Examine for a moment an ordinary mind on an ordinary day. The mind receives a myriad impressions — trivial, fantastic, evanescent, or engraved with the sharpness of steel. From all sides they come, an incessant shower of innumerable atoms; and as they fall, as they shape themselves into the life of Monday or Tuesday, the accent falls differently from of old…Life is not a series of gig lamps symmetrically arranged; life is a luminous halo, a semi-transparent envelope surrounding us from the beginning of consciousness to the end.

Viewing these neat and tidy profiles and those expertly curated Twitter streams, Woolf’s words take on fresh resonance. Life, indeed, does not seem to be like this. If social media and internet living are where we will all increasingly be, they must become places where the honest expression of the many different internal selves can find a home. Perhaps we need less narrative – less desire to be a coherent, singular story that others *like* – and more spaces where the individual can change, be contradictory and experience anew.


The Valley of Shit

Author: Inger Mewburn
Original: Thesis Whisperer


I have a friend, let’s call him Dave, who is doing his PhD at the moment.

I admire Dave for several reasons. Although he is a full-time academic with a young family, Dave talks about his PhD as just one job among many. Rather than moan about not having enough time, Dave looks for creative time management solutions. Despite the numerous demands on him, Dave is a generous colleague. He willingly listens to my work problems over coffee and always has an interesting suggestion or two. His resolute cheerfulness and ‘can do’ attitude are an antidote to the culture of complaint which seems, at times, to pervade academia.

I was therefore surprised when, for no apparent reason, Dave started talking negatively about his PhD and his ability to finish on time. All of a sudden he seemed to lose confidence in himself, his topic and the quality of the work he had done.

Dave is not the only person who seems to be experiencing these feelings lately. I have another friend, let’s call him Andrew.

Andrew is doing his PhD at a prestigious university and has been given an equally prestigious scholarship. Like Dave, Andrew approaches his PhD as another job, applying the many time management skills he had learned in his previous career. He has turned out an impressive number of papers, much to the delight of his supervisors.

Again I was shocked when Andrew emailed me to say he was going to quit. He claimed everything he did was no good and it took a number of intense phone calls to convince him to carry on.

Both these students were trapped in a phase of PhD study I have started to call “The Valley of Shit”.

The Valley of Shit is that period of your PhD, however brief, when you lose perspective and therefore confidence and belief in yourself. There are a few signs you are entering into the Valley of Shit. You can start to think your whole project is misconceived or that you do not have the ability to do it justice. Or you might seriously question if what you have done is good enough and start feeling like everything you have discovered is obvious, boring and unimportant. As you walk deeper into the Valley of Shit it becomes more and more difficult to work and you start seriously entertaining thoughts of quitting.

I call this state of mind the Valley of Shit because you need to remember you are merely passing through it, not stuck there forever. Valleys lead to somewhere else – if you can but walk for long enough. Unfortunately the Valley of Shit can feel endless because you are surrounded by towering walls of brown stuff which block your view of the beautiful landscape beyond.

The Valley of Shit is a terrible place to be because, well, not to put too fine a point on it – it smells. No one else can (or really wants to) be down there, walking with you. You have the Valley of Shit all to yourself. This is why, no matter how many reassuring things people say, it can be hard to believe that the Valley of Shit actually does have an end. In fact, sometimes those reassuring words can only make the Valley of Shit more oppressive.

The problem with being a PhD student is you are likely to have been a star student all your life. Your family, friends and colleagues know this about you. Their confidence in you is real – and well founded. While rationally you know they are right, their optimism and soothing ‘you can do it’ mantras can start to feel like extra pressure rather than encouragement.

I feel like I have spent more than my fair share of time in the Valley of Shit. I was Thesis Whispering while I was doing my PhD – so you can imagine the pressure I felt to succeed. An inability to deliver a good thesis, on time, would be a sign of my professional incompetence on so many levels. The Valley of Shit would start to rise up around me whenever I started second-guessing myself. The internal monologue went something like this:

“My supervisor, friends and family say I can do it – but how do they really KNOW? What if I disappoint all these people who have such faith in me? What will they think of me then?”

Happily, all my fears were groundless. My friends, teachers and family were right: I did have it in me. But boy – the smell of all those days walking in the Valley of Shit stays with you.

So I don’t want to offer you any empty words of comfort. The only advice I have is: you just have to keep walking. By which I mean just keep writing, doing experiments, analysis or whatever – even if you don’t believe there is any point to it. Remember that you are probably not the right person to judge the value of your project or your competence right now.

Try not to get angry at people who try to cheer you on; they are only trying to help. Although you are alone in the Valley of Shit there is no need to be lonely – find a fellow traveller or two and have a good whinge if that helps. But beware of indulging in this kind of ‘troubles talk’ too much lest you start to feel like a victim.

Maybe try to laugh at it just a little.

You may be one of the lucky ones who only experience the Valley of Shit once in your PhD, or you might be unlucky and find yourself there repeatedly, as I did. I can completely understand those people who give up before they reach the end of the Valley of Shit – but I think it’s a pity. Eventually it has to end because the university won’t let you do your PhD forever. Even if you never do walk out the other side, one day you will just hand the thing in and hope for the best.
