On Emotions and Overthinking in Academia

@angry_prof | 25/10/16


I distinctly remember having one particularly confusing week in grad school in 2001. I was funded, published, and on track to complete my dissertation by age 27. But for some reason, that was the week I chose to lie extensively to my university, advisor, and family about having meningitis and spent the entire week on my sofa bed watching Maury Povich. No, this wasn’t the gut-punch anxiety of intentionally emailing the wrong attachment because my comprehensive exam was not finished on time, or the total emotional collapse after my significant other moved away. This didn’t make sense.

As a professor and professional overthinker, I’ve grown accustomed to confused looks when I explain a train of thought or how I make decisions; disquieting looks of incredulity mixed with sadness and a regrettable inability to empathize. Faces both impressed by the sheer volume of overlaid cognition and clearly appreciative of not having to live inside of it. And I’m fully aware that I produce similarly conflicted microexpressions when I hear “I love what I do” reflecting both a disdain for flowery emotional language and a deep-seated envy of being able to suspend disbelief about the academic system long enough to develop feelings for it.

So I suppose it’s really not that surprising that there exist remarkably few people with the intestinal fortitude to tolerate my apparent inability to bask in the projected Hunger Games glory of tenure, persistent use of exile as a metaphor for sabbatical, and rehearsed disillusionment with academia as a dystopian, publisher-owned, ego-fuelled Matrix. I get that I’m not the most optimistic person, and that I should presumably have already gotten used to the interpersonal disconnect and ambivalent isolation afforded by an academia-trained propensity for overthought.

But maybe it’s FOMOOE – fear of missing out on overthinking everything – that kills the idea of optimism before it infects. Or maybe it’s my life-long membership in the cult of the next, that ever-lengthening pursuit of the perfect title, institution, journal, award, or mention by one’s academic hero – that pinhole of guiding light that will one day transform into a glorious beacon announcing one’s prophetic insight, intellectual ferocity, or near-death pursuit of knowledge to the world. That imagined validating end point making all the nights, compromises, and forgone personal life experiences worthwhile.

Or maybe it’s just me. Maybe it’s that academics like me tend to self-select into this heady ego system, tolerating a culture of intellectual prize-fighting at the expense of overworking the eager in order for those occasional strokes of ego to feel that much more self-soothing. That heart flutter of excitement when opening a conference notification email. That profound swelling of pride when seeing your name and affiliation formatted in columns in your publication PDF. That feeling of royalty when stepping off a plane in a foreign land to address an adoringly naive, intellectually starved audience satisfied only by the acute physical apperception of soul-quenching speculation leaving your lips one syllable at a time.

I don’t know. Sometimes I think my experience in academia would be easier if I could better ignore how the intellectual stimulation of discovery or pride of publication doesn’t quite mask the loneliness of being the only one who understands what you do at your institution, or drinking alone at a hotel bar because everyone else at the conference was meeting up with old colleagues. I sometimes wonder if imposter syndrome is specific enough a label to cover feeling out of place not because of skills or reputation, but because of having too many feelings or thinking too much about them. I also often wonder if my colleagues are really my friends, or if we’re just the only ones consistently left behind as students continually move on to more interesting developmental milestones and career challenges.

But what tends to bug me the most is that I can’t decide whether I think too much, feel too much, or both; whether I’m overthinking my feelings, or getting too emotional about the way I think. And then there’s trying to figure out if all this thinking and feeling is typical, if I am alone in wondering why all of this seems so confusing. Whether spending a week in bed means I’ve developed a remarkably sophisticated premature disillusionment with the publishing oligarchy dominating academic politics, or if I might just be depressed because I’m alone, as a normal person would be. It’s a confusing process trying to decide if being a good academic means harnessing all emotions toward the good of science, or alternatively, if having feelings that get in the way of writing means I’ve chosen the wrong profession.

The hypothesis that this extent of deliberation over my emotions makes me special is not supported by immediate responses to sarcastic attention flares on Twitter; it is readily debunked by body language from colleagues that very clearly tells me to stop talking because I’m making everyone uncomfortable. It’s not easy bringing up feeling confused, disillusioned, sad, lonely, or depressed in academic circles without worrying about how it will impact departmental politics or your professional reputation. And I’m not saying I’m particularly adept at expressing these sentiments or admitting when I need help, but I have learned a few things since grad school.

First, I am not alone. I have learned to recognize a familiar pain in the eyes of students, post-docs, and fellow faculty when I talk about the struggle to maintain self-care or personal relationships in the face of teaching demands or the pressure to always be writing. I now notice the quiet nods from colleagues when intimating through a change in tone or well-timed silence how truly lonely it can be to live inside your head for a living. And just as I’ve tried to create a safe space for students to yell or cry over illness, disability, loss, discrimination, finances, family, or even a manuscript rejection, I’ve also seen full professors completely break down when things were too much.

Second, saying these things out loud takes practice. Yes it does feel exceptionally weird and like an explicit admission of weakness or collective betrayal to admit doubting yourself, regretting academic career decisions, or acknowledging that your love for what you do may not be strong enough to compensate for its emotional toll. But there are few things like hearing yourself say the words “I don’t enjoy this any more” or “I think I’m just really lonely” out loud to kickstart your academic propensity to problem solve or to stumble across someone you actually believe when they say “I hear you” or “it will be ok”.

Finally, I’ve learned that although I may as an academic be able to convince myself that my emotions are too complicated or specialized for colleagues, friends, family, or the general public to appreciate, this is complete bullshit. Arguably the most reliable consequence of assuming that my feelings were not understandable by others because they concerned impact factors, letters to editors, intradisciplinary norms, training doctoral students, or teaching/evaluating higher-order cognition was that I was left feeling even more alone than before.

In my experience, academics are not a special breed immune to basic emotions, but instead uniquely equipped to paint ourselves into a corner of isolation by convincing ourselves that our experiences are qualitatively unique as evidenced by others not understanding what we say or do. Feeling embarrassed about not being able to keep a promise to yourself is not unique. Feeling shame when facing unmistakable consequences of choosing your career over your family does not make you special. Wondering if you’ll ever achieve a level of success where you won’t feel like an imposter is so common they’ve had a label for it since, like, the ’70s.

If admitting you have these feelings is the first step to feeling less alone, the next step is probably swallowing your pride and putting it as simply as possible. Although perhaps not as metacognitively satisfying as “mitigating affective disengagement by way of linguistic transduction and affiliation”, being honest about how you feel might require the humbling realization that although your work might set you apart, your feelings don’t. Whether starting with sarcastic quips on Twitter or a trip to your friendly neighborhood psychologist, there are people who listen if you try to say something.

In an academic world where cognition is currency and publication is king, I understand the academic disinterest toward emotions not involving passion, inspiration, or perseverance that can distract from writing and contributing to science. I’m just saying that pursuing your academic dreams can lead to treating your emotions like an afterthought, and that as overthinkers, we can probably do better.


Could Parental Leave Actually be Good for my Academic Career?

Author: David Kent
Original: University Affairs | The Black Hole


Last autumn, I started my research lab at the University of Cambridge’s Stem Cell Institute, but this coming summer I’m doing something completely different – I’m taking parental leave with my first child. I must admit that at least some inspiration came from my brother, who took a term off with his second child and said it was one of the best decisions he’d ever made.

It’s been a tough journey to get a group leader position – 11 years of intense research-focused time, most of which was spent in a complete black hole of uncertainty with respect to my future career. And now, I won’t be in the lab for 14 weeks – we’ll see how it all works out.

Reaction to my decision amongst non-academic family and friends was pretty much universally positive, but reaction from academic colleagues was highly variable – a substantial number of whom think I’m absolutely crazy to take off so much time within the first year of my research lab’s existence. I wasn’t too surprised by this, having emerged from the North American system where parental leave is much less generous than in Europe. What I didn’t expect were the other reactions …

In November, I was at a national cancer conference and at one of the evening receptions I spoke with a female scientist from another U.K. university about women in science. Over the course of the discussion, I mentioned that my partner and I would be taking advantage of the U.K.’s new “Shared Parental Leave” policy, with my partner taking 8.5 months of leave and me taking 3.5 months. She said she was shocked and surprised that a brand new group leader would take the time off, but also said “good for you.”

The next evening is when things really hit home though. After the conference dinner I was on the dance floor and a complete stranger came up to me and asked, “Are you David Kent?” I assumed she had seen my presentation earlier in the day until she continued, “the David Kent who is taking parental leave as a new group leader? I just wanted to say thank you.” We chatted a little and it was as simple as this: a male group leader taking parental leave was just not that common, especially not a 3.5-month block of time. The professor from the other night had clearly gone off and told her colleagues and word had spread.

Here I was being showered with praise for taking 3.5 months off work and feeling pretty good about my decision until I did a quick comparison to my partner’s situation, also an early career scientist. Not only would she be taking nearly three times the amount of leave, but she’s also been carrying a baby around for eight months whilst undertaking world-class research. Is there a small fan club of approving academics lined up to congratulate her on the brave decision to spend time with her child? Not that I’ve seen.

So, in effect, my taking a short block of parental leave has boosted my profile in the eyes of some academics and her taking a longer block will put her in the challenging position that so many young female academics find themselves in: trying to play catch-up and pretend that children haven’t impacted their careers (many do not acknowledge children on CVs, job applications, etc., for fear of being viewed unfavourably). The science community needs to embrace rather than shun such individuals.

Overall, if universities want more women in science, then the way we handle babies and families needs to change – men need to be as “risky” to hire as women. But change does not come overnight and it does not come easy. As a start, more countries (and institutions) need to have “use it or lose it” policies, such as the one in Quebec – the father is given a block of time that the mother cannot use. Universities and individuals need to fight for this. Countries such as Sweden have seen incredible results from such policies and are amongst the world leaders in having women in senior positions. For science specifically, granting agencies need to behave like the European Research Council with respect to eligibility windows and like EMBO for postdoctoral fellowships – creating small allowances for young parents that make the journey just a little bit easier.

Or perhaps we should just force them all out of science – that seems to be the way things are currently set up and it makes me worry for our future science workforce.


It’s OK to Quit Your PhD

Author: Jennifer Polk
Original: From PhD to Life


Occasionally I’m asked about quitting, particularly “quitting” a PhD program. This happened several times last week, when I was in Vancouver.

Contrary to what you may hear or what your own internal critics tell you, there’s no shame in moving on. I remember a long post on a Versatile PhD forum from “PJ,” an ABD thinking about leaving instead of spending another two years (minimum) to finish their PhD. In response, one commenter wrote, “But the real question is, do you want to be a quitter? Now, not everyone will view that question the same, and I’m sure many will say that equating quitting a PhD program to being a quitter is not valid, but in reality, it is.” No! Thankfully, most other commenters on the thread offered more nuanced and helpful reflections and advice. “Finishing is not just about the destination,” one former tenure-track professor pointed out. “If that’s the only thing you want, then it’s a tough few years ahead.” Indeed.

Before you make the decision to leave, separate your inner critic – who may well be reflecting outer critics in your life – from what you know is right for you. Trust your gut, not your gremlin. In my experience, this is a decision that individuals make and re-make over time. I’ve worked with a few clients who’ve contemplated not finishing their PhD programs. While you figure out what you want, it’s ok to be ambivalent, carrying on the work but distancing yourself psychologically and emotionally from academia. What are your goals? Once you know them, you can determine the correct strategy to move toward them. (With thanks to Harvey P. Weingarten’s recent post.)

The “no one likes a quitter” attitude that exists in graduate school and perhaps in academia writ large isn’t warranted. There is nothing inherently good or bad about completing a PhD. It’s only a good move for you if it is a good move for you. While individuals who depart sans degree will come to their own personal conclusions about their decisions, the wide world rarely cares. It’s instructive that in PJ’s original post, they mentioned that their former undergraduate professors were unanimous in advising them to quit. I’ll let English professor (and graduate advisor) Leonard Cassuto speak for ideal advisors everywhere: “Most of my advisees finish their dissertations and get jobs. I’m proud of them. But some walk away – and of that group I’m just as proud” (Graduate School Mess, p. 121). I feel the same way about my own clients, whatever path they choose to take.

A while back Christine Slocum reflected on her career journey in a Transition Q & A post. She’d completed an MA and then two years of a PhD program, then moved on before achieving ABD status. In her post she explains there were several reasons for her choice, including feeling burnt out, lack of community in her department, and desire to start a family. Pursuing the doctorate no longer meshed with her goals: “After some soul searching, I remembered that the reason I was pursuing sociology in the first place was to better understand the mechanisms of social stratification because I wanted to better understand how to undo it. ​Four years of graduate study [later,] I felt like I had enough that the next five years would be better spent working for an NGO, nonprofit, or government position getting practical experience in the field.”

Heather Steel made a similar decision when she decided not to continue her PhD in the midst of dissertating. She learned important information about herself during graduate school. “There were parts of my program that I enjoyed very much (classes, having the chance to read and think, teaching, and my colleagues), but in the end,” she realized, “sitting for hours in front of a microfilm reader to write something that few people would actually read was not fulfilling.” Heather learned that she enjoys “research in small doses, not projects that take years to see results.” When I did an informational interview with her during my transition, I learned that she didn’t regret her choices. Her career has continued to progress since then.

When I was in Vancouver, a graduate student in the audience at one of my talks shared his own story: He’d been enrolled in a PhD program years before, then left. But here he was back doing another doctorate! He was nearly done, and this time around he knew it was the correct path for him. I know several people who’ve done similar things, for a variety of reasons. Fascinating, eh?

If completing your PhD is the right move for you, carry on. Get support and help wherever you can find it, go part-time, or take a break or a leave of absence. Make whatever changes you need to smooth your journey. But if the doctorate no longer makes sense — your goals have changed, you’ve learned more about yourself over the years — then I’ve got your back (in spirit) in deciding not to continue. You’re not “quitting” or “leaving”; instead, you’re embarking on a new, better-for-you path, taking what you learned and experienced and applying it in a context that’s more suitable to who you are, how you work best, and where you want to go. That’s risky and brave, but it’s also just you standing up for yourself. It took me until after my PhD to do that. Feel free to do as I didn’t.


Scientists Have the Power to Change the Publishing System

Author: David Kent
Original: University Affairs | The Black Hole


Earlier this month I read an article by Julia Belluz that ripped into the scientific publishing system. The saddest, and truest, sentiment of the article can be summed up in the following quotation:

“Taxpayers fund a lot of the science that gets done, academics peer review it for free, and then journals charge users ludicrous sums of money to view the finished product.”

This is certainly not the first attack against the publishing process nor the first to encourage open-access publishing. In the remainder of her article, Ms. Belluz focuses on the role that governments can play in getting more scientific research freely and instantly available. In sum, she suggests that government funding agencies (e.g., the United States National Institutes of Health or the Canadian Institutes of Health Research) could refuse to give grants to those scientists who did not publish in open-access journals.

This is a laudable approach, and indeed it is the one being taken bit by bit by funding agencies – the Wellcome Trust in the U.K., for example, has a very robust open-access policy that includes providing grant funding for open-access charges. While this will certainly get more research out sooner and without charge, I believe it misses out on an important aspect of the power dynamic that plagues the scientific publishing process.

The fact is that journals with high impact factors wield enormous power because they hold the key to scientists’ careers – the field has become so obsessed with metrics that it is insufficient to be a good scientist with good ideas and the ability to perform good research. As things stand now, if you want research grants (and in most cases, this means if you want a job), then you need to publish a paper (or several!) with a big-name journal.

So what can scientists do? Well, it turns out scientists are involved in just about every aspect of the publishing power dynamic. First, one needs to understand what’s at stake. Scientists want big name papers for three main reasons:

  1. Grants
  2. Jobs
  3. Recognition

However, papers in big-name journals do not directly give you grants or jobs, nor are they the only way to be recognized as a good scientist. Other scientists make these decisions, but far too often their judgment is impacted by the glitz and glam of the big-name journals.

Jobs are often won by those doing research that has good institutional fit – they bring a novel technology, a new way of looking at things, or a broad network of excellent former colleagues – but jobs are often lost because the candidate is “not fundable.” The latter is more often than not decided based on where they have published and how a grants panel will view them. So it basically comes down to who can get grants. And who generally decides funding outcomes? Scientists.

I wonder how many grant panels have heard the phrase “the project looks good, but the candidate has only ever published in mid-range journals.” Indeed, I know several scientists who rank applications based on a candidate’s publication record irrespective of how good or bad the project is or how well-resourced the working environment is.

One suggestion: Ban the CV from the grant review process. Rank the projects based on the ideas and ability to carry out the research rather than whether someone has published in Nature, Cell or Science. This could in turn remove the pressure to publish in big journals. I’ve often wondered how much of this could actually be drilled down to sheer laziness on the part of scientists perusing the literature and reviewing grants – “Which journals should I scan for recent papers? Just the big ones surely…” or “This candidate has published in Nature already, they’ll probably do it again, no need to read the proposal too closely.”

Of course I generalize, and there are many crusaders out there (Michael Eisen, Randy Schekman, Fiona Watt, etc.) pushing to change things and I mean them no offence. I just wish that more people could feel safe enough to follow their lead. In my own journey to start up a lab, I am under enormous pressure to publish in a big journal (i.e., my open-access PLoS Biology paper doesn’t make the grade and open-access juggernaut eLife has yet to achieve high-level status despite its many philosophical backers).

So, in sum, scientists in positions of power (peer reviewers, institute directors, funding panel chairs) are the real targets for change. Assess based on research merit, not journal label. Let’s make journals tools of communication, not power brokers of scientific careers.


The Valley of Shit

Author: Inger Mewburn
Original: Thesis Whisperer


I have a friend, let’s call him Dave, who is doing his PhD at the moment.

I admire Dave for several reasons. Although he is a full-time academic with a young family, Dave talks about his PhD as just one job among many. Rather than moan about not having enough time, Dave looks for creative time management solutions. Despite the numerous demands on him, Dave is a generous colleague. He willingly listens to my work problems over coffee and always has an interesting suggestion or two. His resolute cheerfulness and ‘can do’ attitude are an antidote to the culture of complaint which seems, at times, to pervade academia.

I was therefore surprised when, for no apparent reason, Dave started talking negatively about his PhD and his ability to finish on time. All of a sudden he seemed to lose confidence in himself, his topic and the quality of the work he had done.

Dave is not the only person who seems to be experiencing these feelings lately. I have another friend, let’s call him Andrew.

Andrew is doing his PhD at a prestigious university and has been given an equally prestigious scholarship. Like Dave, Andrew approaches his PhD as another job, applying the many time management skills he had learned in his previous career. He has turned out an impressive number of papers, much to the delight of his supervisors.

Again I was shocked when Andrew emailed me to say he was going to quit. He claimed everything he did was no good and it took a number of intense phone calls to convince him to carry on.

Both these students were trapped in a phase of PhD study I have started to call “The Valley of Shit”.

The Valley of Shit is that period of your PhD, however brief, when you lose perspective and therefore confidence and belief in yourself. There are a few signs you are entering into the Valley of Shit. You can start to think your whole project is misconceived or that you do not have the ability to do it justice. Or you might seriously question if what you have done is good enough and start feeling like everything you have discovered is obvious, boring and unimportant. As you walk deeper into the Valley of Shit it becomes more and more difficult to work and you start seriously entertaining thoughts of quitting.

I call this state of mind the Valley of Shit because you need to remember you are merely passing through it, not stuck there forever. Valleys lead to somewhere else – if you can but walk for long enough. Unfortunately the Valley of Shit can feel endless because you are surrounded by towering walls of brown stuff which block your view of the beautiful landscape beyond.

The Valley of Shit is a terrible place to be because, well, not to put too fine a point on it – it smells. No one else can (or really wants to) be down there, walking with you. You have the Valley of Shit all to yourself. This is why, no matter how many reassuring things people say, it can be hard to believe that the Valley of Shit actually does have an end. In fact, sometimes those reassuring words can only make the Valley of Shit more oppressive.

The problem with being a PhD student is you are likely to have been a star student all your life. Your family, friends and colleagues know this about you. Their confidence in you is real – and well founded. While rationally you know they are right, their optimism and soothing ‘you can do it’ mantras can start to feel like extra pressure rather than encouragement.

I feel like I have spent more than my fair share of time in the Valley of Shit. I was Thesis Whispering while I was doing my PhD – so you can imagine the pressure I felt to succeed. An inability to deliver a good thesis, on time, would be a sign of my professional incompetence on so many levels. The Valley of Shit would start to rise up around me whenever I started second-guessing myself. The internal monologue went something like this:

“My supervisor, friends and family say I can do it – but how do they really KNOW? What if I disappoint all these people who have such faith in me? What will they think of me then?”

Happily, all my fears were groundless. My friends, teachers and family were right: I did have it in me. But boy – the smell of all those days walking in the Valley of Shit stays with you.

So I don’t want to offer you any empty words of comfort. The only advice I have is this: you just have to keep walking. By which I mean just keep writing, doing experiments, analysis or whatever – even if you don’t believe there is any point to it. Remember that you are probably not the right person to judge the value of your project or your competence right now.

Try not to get angry at people who try to cheer you on; they are only trying to help. Although you are alone in the Valley of Shit there is no need to be lonely – find a fellow traveller or two and have a good whinge if that helps. But beware of indulging in this kind of ‘troubles talk’ too much lest you start to feel like a victim.

Maybe try to laugh at it just a little.

You may be one of the lucky ones who only experience the Valley of Shit once in your PhD, or you might be unlucky and find yourself there repeatedly, as I did. I can completely understand those people who give up before they reach the end of the Valley of Shit – but I think it’s a pity. Eventually it has to end because the university won’t let you do your PhD forever. Even if you never do walk out the other side, one day you will just hand the thing in and hope for the best.


The Productivity Robbing Myths of Grad School

Author: Steve Shaw
Original: How Not To Suck at Grad School


I am not sure if there is a best way to be efficient and productive as there are many very different, but positive, ways to work. However, there are some common and universally terrible ways to work. Here are a few things that I hear students say with pride that are actually signs of an inefficient worker.

“I do my best work at the last minute. I thrive under pressure.”

–No. The first draft of everything is terrible, even for the best writer. You may be an extremely good binge writer, but I promise that the work will be better with another draft and some time to consider and change content.  Plan your time well. The draft of any project should be completed three days to two weeks before it is due. The remainder of the time can be spent in the real work of writing: editing.

“I am not a detail person. I am an idea person.”

–Ideas that are well-researched, communicated in detail, completely thought out, and effectively implemented are useful. All others tend to be vague dreams that border on hallucinations. Everyone is a dreamer, but the truly useful person works hard and uses detail to convert dreams into reality.

“I am a perfectionist.”

–This is not a positive trait. Trying to pursue perfection is a useless activity that is harmful to well-being and productivity. Being conscientious, detail focused, and striving for excellence are laudable characteristics. Perfectionism is maladaptive.

When I hear people tell me that they are a perfectionist, I feel the need to assess further to determine if we simply are defining perfectionism differently or if their behavior is maladaptive. Usually people mean that they are detail focused and striving for excellence with undertones of anxiety. This is typically a good set of characteristics for grad students. But when they mention the need to be perfect, then we are into a zone where anxiety may be maladaptive. Seeking excellence is good. Seeking perfection is a neurotic waste of time.

“I edit while I write.”

–This is a guaranteed method of getting nothing finished or severely limiting your productivity. Get all of your ideas out on paper. Only edit when you have completed a document or at least a substantial portion. Editing while writing is slow, makes for choppy prose, reduces flow and creativity, and increases anxiety. People with this habit also tend to be perfectionists and have learned this habit while doing last-minute work. Take the time to complete a full draft and then edit.

“I don’t want to show this to you until it is ready.”

–I understand this secrecy problem. Some supervisors are extremely judgmental and even hostile to unfinished work. Submitting any work is aversive under these conditions. The best approach is to have students submit work on a timed basis, even if it is raw. The difference between a professional and an amateur writer is deadlines. Working to a deadline is more important than achieving the mythic ideal paper. I also find that when students wait to submit their ideal paper, they are crushed when substantial revisions are needed. The supervisor can make suggestions and edits, improve the paper, and move on without judgment. The goal is to develop a relationship that produces a large amount of scholarly material in an efficient manner. Trust between a student and supervisor is the best way to make this happen. When the secrecy issue is fostered, we are teaching grad students to be perfectionists and adding anxiety to their lives.

“I’m a multi-tasker.”

–You are not. You can only attend to one task at a time. Many folks have developed a sophisticated skill set where they actively shift attention from one task to another. You attend to the television for a few minutes and then back to your book—you cannot do both at the same time. That goes for radio or music as well. You can focus on music or focus on your work, not both. What we tend to do is shift attentional focus. If you are listening to music and you know what was playing and enjoyed it, then you are shifting focus. Once you are in an activity where you are shifting focus between two things, then your efficiency is being robbed. There is some evidence that music with a constant beat and no lyrics can actually aid in concentration and focus. Classical music is an example. When I am at my most scattered, I listen to a metronome to help with focus. But no one is truly multitasking; you are rapidly shifting attention and reducing efficiency. This is not necessarily bad, but it is inefficient and should be used sparingly.

My wife works from home with the TV on.  She says that she likes the noise while she works. However, when I ask her what she is watching on television, she has no idea. She is certainly losing some focus, but not as much as she would if she was at all attending to the TV. I watch television while working only on weekends. I am mostly watching TV, but get a little work done at commercials. Not efficient and focused work, but better than nothing.

White noise can be a better idea than music or TV. White noise can be ideal for folks who like a level of sound to mask the often jarring ambient noise of your real environment such as construction, lawn maintenance, and loud neighbors. There are several white noise generators available online such as http://mynoise.net/NoiseMachines/whiteNoiseGenerator.php and http://simplynoise.com/ . One of my favourite websites and apps is http://www.coffitivity.com/. This site plays the ambient noise from a coffee shop. You can even select the type of coffee shop noise from “morning murmur” to “lunchtime lounge” to “university undertones.” This style of white noise is also helpful for the folks who actually prefer to do creative work in coffee shops, but cannot get there. I do not understand how people do this as my attention flits to the homeless guy, the hostile person in a long line, and the sounds of coffee slurpers; nonetheless many people do their creative work in coffee shops. The white noise from coffitivity is associated with a place of creativity, which can put you in the mood to work. The secret of white noise is that there is no content in the noise to draw attention away from your work.

Once I learned the skill of unitasking, I became at least twice as efficient as before. Now I do one thing fully focused until completed and then turn my attention to the next task. Not only is my work completed at a faster pace as a unitasker; I enjoy movies, TV, and music much more. And as an extra bonus, there are not the nagging feelings of guilt that go along with such multitasking.

We all develop work habits, and there are many ways to be a productive worker. But as grad students and professors face increasing pressure to produce, the limits of our work habits are often reached and exceeded. What worked as an undergrad no longer works and now falls under the heading of a maladaptive habit. There is a constant need to hone your work habits and to remove the productivity-robbing myths and habits from your work.


Giving Up On Academic Stardom

Author: Eric Grollman
Original: Conditionally Accepted


I have bought into the ego-driven status game in academia. Hard. I find myself sometimes wondering more about opportunities to advance my reputation, status, name, and scholarship than about creating new knowledge and empowering disadvantaged communities. Decision-making in my research often entails asking what will yield the most publications, in the highest-status journals with the quickest turnaround in peer review. I often compare my CV to others’, wondering how to achieve what they have that I have not, and feeling smug about achieving things that they haven’t. Rarely do I ask how to become a better researcher, but often ask how to become a more popular researcher.

I have drunk the Kool-Aid, and it is making me sick. Literally. The obsession with becoming an academic rockstar fuels my anxiety. I fixate on what is next, ignore the present, and do a horrible job of celebrating past achievements and victories. I struggle to accept “acceptable.” I feel compelled to exceed expectations; I take pride when I do. “Wow, only six years in grad school?” “Two publications in your first year on the tenure track?! And, you’re at a liberal arts college?”

When did I become this way? Sure, academia is not totally to blame. My parents expected me to surpass them in education (they have master’s degrees!). I also suffer, as many gay men do, with the desire to excel to gain family approval, which is partially lost upon coming out. Excelling in college, rather than becoming an HIV-positive drug addict, helped my parents to accept my queer identity. In general, I compensate professionally and socially for my publicly known sexual orientation. It is hard to unlearn the fear one will not be loved or accepted, especially when homophobes remind you that fear is a matter of survival.

Oh, but academia. You turned this achievement-oriented boy into an anxious wreck of a man. It is not simply a bonus to be an academic rockstar of sorts. My job security actually depends on it. And, it was necessary to be exceptional to even get this job. And, it matters in other ways that indirectly affect my job security, and my status in general. You can forget being elected into leadership positions in your discipline if no one knows you. “Who?” eyes say as they read your name tag at conferences before averting their gaze to avoid interacting. I have learned from my critics that one must be an established scholar before you can advocate for change in academia.

The Consequences Of Striving For Academic Stardom

I am giving up on my dream to become the Lady Gaga of sociology. I have to do so for my health. I have to stop comparing myself to other scholars because so many things vary, making it nearly impossible to find a truly fair comparison. Of course, I will never become the publication powerhouse of an Ivy League man professor whose wife is a homemaker. Even with that example, I simply do not know enough about another person’s life, goals, and values to make a comparison. I do not want others to compare themselves to me because my level of productivity also entails Generalized Anxiety Disorder. I am not a good model, either!

Dreams of academic stardom prevent me from appreciating my present circumstances, which were not handed to me. Sadly, voices, which sound awfully similar to my dissertation committees’, have repeatedly asked, “are you surrreeee you don’t want to be at an R1?” I have zero interest in leaving, and negative interest (if that is possible) in enduring the job market again. But, I fear that, as I was warned, I will become professionally irrelevant; and, this has made it difficult to fully appreciate where I am. I have acknowledged the reality that no place will be perfect for an outspoken gay Black intellectual activist. But, I have found a great place that holds promise for even better.

Beyond my health, the lure of academic stardom detracts from what is most important to me: making a difference in the world. Impact factors, citation rates, and the number of publications that I amass distract from impact in the world and accessibility. It is incredibly selfish, or at least self-serving, to focus more energy on advancing my own career rather than advancing my own communities.

Obsession with academic rockstardom forced me to view colleagues in my field as competition. My goal is to demonstrate that what I do in my research is better than what they do. In doing so, I fail to see how we can collaborate directly on projects, or at least as a chorus of voices on a particular social problem. Yet, in reality, no individual’s work can make a difference alone. I also fail to appreciate the great things my colleagues accomplish when I view them only through jealous eyes.

When I die, I do not want one of my regrets to be that I worked too hard, or did not live authentically, or did not prioritize my health and happiness as much as I did my job.  Ok, end of rant.


The Lie Guy

Author: Clancy Martin
Original: Chronicle of Higher Education


You’d think I’d get used to being called a liar. After all, I’ve written a candid, semiautobiographical novel about being a scam artist, been interviewed in the media about my former life of lying, cheating, and drinking, even edited a prominent philosophical collection on deception. But when a colleague recently ridiculed me about being known as a liar, my feelings were hurt. I have a new life. I’ve been clean and sober and “rigorously honest” (as we say in AA) for two years. Still, to tell you the truth (honestly!), I earned my reputation fair and square.

In the Internet age, a sordid past is a matter of very public record—for that matter, of public exaggeration—and if you write fiction and memoir about your worst days, as I did (and continue to do), even your students will take the time to read the racy parts (or at least excerpts in online interviews of the racy parts, or YouTube interviews about the racy parts).

God bless and keep tenure—I’d probably hesitate to be frank in this essay without it—although, to be fair to my institution, the ignominious stories about me and my novel were out before my committee granted me tenure. “It takes an odd person to work on lying,” my late mentor (and friend and co-author), the philosopher Robert C. Solomon, once told me, himself having written one or two of the best papers on the subject.

When I was 26 years old, in 1993, I dropped out of grad school at the University of Texas at Austin—I was on a fellowship, staring day after day at my stalled dissertation among stacks of books and papers from the Kierkegaard Archive in the Royal Library in Copenhagen—to go into the luxury-jewelry business. I decided to burn all of my bridges. I didn’t fill out any forms. I didn’t have the ordinary courtesy even to contact my two dissertation directors, Solomon and Louis H. Mackey. I just vanished.

I told myself that it was a conscious strategy, to prevent myself from going back, but I also knew the truth: that I was simply too ashamed to tell them that I had gone into business for the money. Like many of our deceptions, mine was motivated by cowardice: “Tell the people what they want to hear,” or, if you can’t do that, simply don’t tell them anything at all.

A few years later, my next-door neighbor (my wife and I had just moved in) caught me in the driveway and asked, “Hey, Clancy. Did you go to grad school at the University of Texas?”

“I did, that’s right.” I was already uncomfortable. I opened the door of my convertible. The Texas summer sun frowned cruelly down on me.

“I’m an editor of Bob Solomon’s. He told me to say hello.”

Busted. This was Solomon’s way of calling me on my b.s. It was his personal and philosophical motto, adopted from Sartre: “No excuses!” Take responsibility for your actions. Above all, avoid bad faith. Look at yourself in the mirror and accept—if possible, embrace—the person that you are.

But I was on my way to work, and Bob Solomon, at that point in my life, was the least of my problems. I had him stored neatly in the mental safety-deposit box of “people I had not lied to but had betrayed in a related way.”

The jewelry business—like many other businesses, especially those that depend on selling—lends itself to lies. It’s hard to make money selling used Rolexes as what they are, but if you clean one up and make it look new, suddenly there’s a little profit in the deal. Grading diamonds is a subjective business, and the better a diamond looks to you when you’re grading it, the more money it’s worth—as long as you can convince your customer that it’s the grade you’re selling it as. Here’s an easy, effective way to do that: First lie to yourself about what grade the diamond is; then you can sincerely tell your customer “the truth” about what it’s worth.

As I would tell my salespeople: If you want to be an expert deceiver, master the art of self-deception. People will believe you when they see that you yourself are deeply convinced. It sounds difficult to do, but in fact it’s easy—we are already experts at lying to ourselves. We believe just what we want to believe. And the customer will help in this process, because she or he wants the diamond—where else can I get such a good deal on such a high-quality stone?—to be of a certain size and quality. At the same time, he or she does not want to pay the price that the actual diamond, were it what you claimed it to be, would cost. The transaction is a collaboration of lies and self-deceptions.

Here’s a quick lesson in selling. You never know when it might come in handy. When I went on the market as a Ph.D., I had six interviews and six fly-backs. That unnaturally high ratio existed not because I was smarter or more prepared than my competition. It was because I was outselling most of them.

Pretend you are selling a piece of jewelry: a useless thing, small, easily lost, that is also grossly expensive. I, your customer, wander into the store. Pretend to be polishing the showcases. Watch to see what is catching my eye. Stand back, let me prowl a bit. I will come back to a piece or two; something will draw me. You see the spark of allure. (All great selling is a form of seduction.) Now make your approach. Take a bracelet from the showcase that is near, but not too near, the piece I am interested in. Admire it; polish it with a gold cloth; comment quietly, appraisingly on it. You’re still ignoring me. Now, almost as though talking to yourself, take the piece I like from the showcase: “Now this is a piece of jewelry. I love this piece.” Suddenly you see me there. “Isn’t this a beautiful thing? The average person wouldn’t even notice this. But if you’re in the business, if you really know what to look for, a piece like this is why people wear fine jewelry. This is what a connoisseur looks for.” (If it’s a gold rope chain, a stainless-steel Rolex, or something else very common and mundane, you’ll have to finesse the line a bit, but you get the idea.)

From there it’s easy: Use the several kinds of lies Aristotle identified in Nicomachean Ethics: A good mixture of subtle flattery, understatement, humorous boastfulness, playful storytelling, and gentle irony will establish that “you’re one of us, and I’m one of you.” We are alike, we are friends, we can trust each other.

The problem is, once lying to your customer as a way of doing business becomes habitual, it reaches into other areas of your business, and then into your personal life. Soon the instrument of pleasing people becomes the goal of pleasing people. For example, who wouldn’t want to buy a high-quality one-carat diamond for just $3,000? (Such a diamond would cost $4,500 to $10,000, retail, depending on where you buy it.) But you can’t make a profit selling that diamond for $3,000—you can’t even buy one wholesale for that amount. Since the customer can’t tell the difference anyway, why not make your profit and please the customer by simply misrepresenting the merchandise? But that’s deceptive trade! There are laws against that! (There’s a body of federal law, in fact: the Uniform Deceptive Trade Practices Act. Texas awards triple damages plus attorney’s fees to the successful plaintiff.) Aren’t you worried about criminal—or at least civil—consequences? And how do you look at yourself in the mirror before you go to bed at night?

During my bleakest days in business, when I felt like taking a Zen monk’s vow of silence so that not a single lie would escape my lips, I often took a long lunch and drove to a campus—Southern Methodist University, Texas Christian University, the University of Texas at Arlington—to see the college kids outside reading books or holding hands or hurrying to class, and to reassure myself that there was a place where life made sense, where people were happy and thinking about something other than profit, where people still believed that truth mattered and were even in pursuit of it. (OK, perhaps I was a bit naïve about academic life.)

I was in the luxury-jewelry business for nearly seven years, and though I don’t believe in the existence of a soul, exactly, I came to understand what people mean when they say you are losing your soul. The lies I told in my business life migrated. Soon I was lying to my wife. The habit of telling people what they wanted to hear became the easiest way to navigate my way through any day. They don’t call it “the cold, hard truth” without reason: Flattering falsehoods are like a big, expensive comforter—as long as the comforter is never pulled off the bed.

It seemed that I could do what I wanted without ever suffering the consequences of my actions, as long as I created the appearance that people wanted to see. It took a lot of intellectual effort. I grew skinnier. I needed more and more cocaine to keep all my lies straight. And then, one morning, I realized that I had been standing in “the executive bathroom” (reserved for my partner and myself) at the marble sink before a large, gilt Venetian mirror every morning for days, with my Glock in my mouth (in the jewelry business, everyone has a handgun). I still remember the oily taste of that barrel. Before I confronted the fact that I was trying to kill myself, I had probably put that gun in my mouth, oh, I don’t know—20, 30 times. I said, “Enough.”

I called Bob Solomon. That was in May of 2000.

I was relieved when he didn’t answer his phone. I left a message: “I’m sorry, Dr. Solomon. I’d like to come back.” Words to that effect, but at much greater length. I think the beep cut me off.

When he called back, I was too frightened to pick up. I listened to his voice-mail message. He said, “Clancy, this is not a good time to make yourself difficult to get ahold of.”

I called again. He let me off easy. (He was perhaps the most generous person I’ve ever known.) I caught him up with the past six years of my life. He told me to call him Bob, not Dr. Solomon: “We’re past that.” Then he said, “So, why do you want to come back?”

“I want to finish what I started, Bob.”

“That’s a lousy reason. Try again.”

“I need to make a living that’s not in business. I hate being a businessman, Bob.”

“So be a lawyer. Be a doctor. You’ll make more money. It’s not easy to get a job as a professor these days, Clancy.”

“It’s the one thing I really enjoyed. Philosophy was the only thing that ever truly interested me. And I have some things I want to figure out.”

“Now you’re talking. Like what? What are you thinking about?”

“Lying. Or failure. I feel like I know a lot about both of them right now.”

(I was writing a long essay about suicide, which, come to think of it, might have been more to the point at the time. But I didn’t want to scare him off.)

A beat.

“Nobody wants to read about failure. It’s too depressing. But lying is interesting. Deception? Or self-deception? Or, I’m guessing, both?”

“Exactly. Both. How they work together.”

With the help of a couple of other professors who remembered me fondly, in the fall semester of 2000, Bob Solomon brought me back to the philosophy doctoral program at Austin, and I started work on a dissertation called “Nietzsche on Deception.” One of the other graduate students—Jessica Berry, now one of philosophy’s best young Nietzsche scholars—called me “the lie guy,” and the moniker stuck.

I went to work on deception not because I wanted to learn how to lie better—I had mastered the art, as far as I was concerned—but because I wanted to cure myself of being a liar. What had started out as a morally pernicious technique had become a character-defining vice. I had to save myself. I needed to understand the knots I had tied myself into before I could begin to untangle them. (It seems like an odd solution now. At the time, I thought I was too smart for therapy.)

It’s an old idea, of course: The Delphic injunction “Know thyself” is an epistemological duty with moral muscle, intended for a therapeutic purpose. Throughout the history of philosophy, until quite recently, it was thought that the practice of philosophy should have a powerful impact on the philosopher’s life—even, ideally, on the lives of others. So I studied deception and self-deception, how they worked together, why they are so common, what harms they might do, and when, in fact, they may be both useful and necessary. Think, for example, about the misrepresentation, evasion, and self-deception involved in falling in love. Who hasn’t asked, when falling in love, “But am I making all this up?” Erving Goffman would have agreed with the joke—I think we owe it to Chris Rock: “When you meet someone new, you aren’t meeting that person, you’re meeting his agent.”

I was lucky: I was awarded my Ph.D. in 2003, and I got a job. Being part of a university as a professor was very different from being a student, even a grad student. Suddenly you have power. In business—especially in retail—the customer has all the power. But students are nothing like customers, although they are starting to act more and more that way, I’ve noticed, and have eagerly adopted the motto “the customer is always right.” My fellow professors wore their power like a crown. They didn’t feel the need to pull a smile out of anyone.

I was still going from classroom to committee room trying to please everyone. I don’t think it harmed me or anyone else, particularly: It was simply unnecessary. As that sank in, I became disoriented. It reminded me of when I was in St. Petersburg, Russia, in the 1990s, trying to hire the world’s best (and most underpaid) jewelers. No one cared about your money. The concept hadn’t yet sunk its teeth into the post-Communist soul. Similarly, in academe, no one paid much attention to the capital—charm—I was accustomed to spending in my daily life.

In fact, charm could even be a hindrance. In my first year, I was asked by a senior colleague to be the research mentor to a philosopher who had been hired around the same time. After talking about my research, my colleague added, “You are mostly who you seem to be.” This from a man who prided himself on being only who he seemed to be—as though we are all only one person!—and as a way of letting me know that he had “seen through me,” that he “was not prey to my charms.” Also, no doubt he was gently letting me know that I didn’t have to pretend to be someone other than I was.

In my old life, everyone was always trying to be more charming than everyone else—even the gruffness of certain wholesalers was (everyone understood) only pretense, the pose of authenticity, the rough exterior that hid the honest, caring heart. To be charming was among the highest virtues.

But now the chair of a science department at my university—a person whom I like very much, and who is enormously charming—and other colleagues often seem suspicious of charm in anyone. Charm is what you expect from administrators, and they, we all know, are not to be trusted. Administrators are just glorified salespeople who can’t publish (so the story goes). A charming student is a dishonest student, an apple polisher.

If I was a bit rude to people, however, if I acted superior, if I had the right mix of intellectual distance and modest moral disdain, I was suddenly a member of the club. I had to be the opposite of eager to please. Other people must be eager to please me. And if they were, I should be suspicious of them. They should be subservient without being (obviously) obsequious. They can flatter, but never as a salesperson flatters; I want flattery only from my equals. This from people who were regularly checking RateMyProfessors.com to see how many hot peppers they’d earned. Or who fretted—or, still worse, pretended not to fret—about their teaching evaluations.

I got Bob Solomon on the phone again.

“Bob, the professor business is even sleazier than the jewelry business. At least in the jewelry business we were honest about being fake. Plus, when I go to conferences, I’ve never seen such pretentiousness. These are the most precious people I’ve ever met.”

“Come on, Clancy. Did you really think people were going to be any better in a university?”

“Um, kind of.” Of course I did. “And it’s not that they’re not better. They’re worse.”

“Well, you may have a point there.” (Bob was always very tough on the profession of being a professor.) “Focus on the students and your writing. The rest of it is b.s.” (That was a favorite expression of Bob’s, as it is of a former colleague of his at Princeton, Harry Frankfurt.)

“With the students, I still feel like I’m selling.” (I was very worried about this.)

“You are selling. That’s part of what it is to be a good teacher.” (Bob was in the university’s Academy of Distinguished Teachers and had won every teaching award in the book. He also made several series of tapes for the Teaching Company.) “To be a good teacher, you have to be part stand-up comic, part door-to-door salesman, part expert, part counselor. Do what feels natural. Be yourself. Are your students liking it? Is it working for you?”

“Yes.” They liked it all right, maybe a bit too much. “And I think they’re learning.”

“Then forget about the rest of it. Just have fun. That’s the best reason for doing it.”

Stendhal wrote: “With me it is a matter of almost instinctive belief that when any … man speaks, he lies—and most especially when he writes.” I still like to tell a good story. But doesn’t everybody who loves teaching? How else are you going to liven up the classroom when students’ eyes are always turning to their iPhones or laptops?

People often ask me now if I miss the jewelry business. My brother and I rode elephants in the mountains of northern Thailand to buy rubies from the miners. I flew to Hong Kong to buy a rope of gigantic black South Sea pearls—each nearly the size of your thumb—and a precious antique jade bracelet from a dying Chinese billionairess, and flew to Paris two days later to sell it to a customer. I walked through the winding, crowded streets of Jerusalem with my diamond wholesaler, talking about the two-state solution. I stayed at the Four Seasons, the Mandarin Oriental, or private mansions of friends. I lived shoulder-to-shoulder with celebrity clients, flew first class, had my suits custom-made, vacationed in Bali or wherever I wanted. More important—thinking of my life today—I didn’t worry about whether my daughters might have to take out student loans.

And the truth is, a lot of the time, that life was fun. The people were rich, noisy, outrageous. When I opened a new store, I felt like I’d created something special.

Would I go back? Do I miss it? No. Sometimes—I write this looking out my office window at the 100-year-old trees outside, their boughs barely lifting and falling in the autumn wind—I feel like a monk who has retreated from a world that was too much for him. “The greatest part of virtue lies in avoiding the opportunity for vice,” St. Augustine teaches us.

Maybe I’m persisting in a kind of self-deceptive naïveté that Bob wouldn’t have approved of, but you could say that my livelihood now depends on telling the truth. Back then I was arms-and-shoulders deep into life, and now at times I feel as though I am only skating on its mirrored surface. But I’d be afraid to go back. I feel peaceful now. It’s less work to be me, and to have me around. I don’t feel the need to lie. Most of the time.

 


Dr. Martin’s new book on deception in romantic relationships, “Love and Lies,” is now available.


Je Suis Reviewer #2

Author: Ana Todorović
Original: Musings


I was recently invited to review a manuscript for a journal I follow regularly. The content was right along the lines of my kind of research, and I was happy to accept. I was, of course, Reviewer Number Two.

I always have been, in each of my fifteen-ish reviewing experiences. But this was the first time that the drop-down menu actually encouraged me to be your stereotypical Number Two.

I was in the second year of my PhD when that first reviewing assignment landed in my lap. So there I was, sifting through my inbox, deleting the “Dear Dr. Todorovic” flattery of predatory publishers. But then I hesitated at this one e-mail, because apart from the heading, it lacked the usual telltale signs of spam.

An invitation to review.

I forwarded it to my supervisor. “Oh, that’s a good journal – don’t you know it? I’d accept if I were you.” Wow. Pride and chagrin, all rolled into one.

I kept re-reading that manuscript, re-wording my review, postponing the submission. Should I tell the editor I’m a clueless student, and not Dr. Todorovic? Should I say I’ve never done this before, that I don’t know what I’m doing or why they picked me? That it’s all a big mistake?

In the end I said nothing. I pressed the submit button; the world didn’t implode. Two days later, an e-mail arrived. The other reviewer hadn’t done their job on time, and the editor thought my concerns were substantial enough to request a major revision. I was mortified.

***

It got easier over time and with some experience, but it never really got easy. I still open the report of the other reviewer, the one that knows what they’re doing, with trepidation. If they caught something I should have, I feel ashamed. If their misgivings align with mine, I’m flooded with relief. If I mention something they didn’t, I worry that I was nitpicking.

As Reviewer #2, I get to see plenty of weak papers in low-impact journals, written in broken English, with poorly described experimental procedures and inconclusive results. It’s very annoying when these come from good labs that write up their other papers, the ones I don’t get to review, with care.

I never know how much to judge and how much to help. I never know if helping will be seen as asking them to write the paper I would have written, the thing Reviewer #2 is notorious for. I never know what to do when I need just a few extra pieces of information to understand the design, before I can decide about the rest. I get frustrated when I don’t understand things, and I worry that this frustration will spill over into my review as pointless vitriol. Another feature of #2.

It’s worse when the journals are good. I can judge whether a design is creative and elegant, and can lead to the claimed conclusions. I can judge whether the analyses are sound. In some cases I will even check whether the numbers in the reported statistics all match up (you’re welcome). But when I have to judge novelty? And whether the wow effect matches the scope of the journal? Good grief, how should I know? Ask Reviewer #1, I can barely keep up with my own narrow topic.

***

Most of the learning from that first review onward was (and still is) a lonely process, with only the other reviewer’s comments as any substantial form of feedback. So every time I hear a gripe about Reviewer #2, I cringe a little on the inside. It’s me, it’s me, and I’m trying to be invisible.

I don’t think we should stop grumbling about Reviewer #2; I’m a big fan of complaining. But maybe, just maybe, a little bit of structured guidance would help? Someone to show us how to be kind but decisive. To tell us to always list strong points, then voice our misgivings as suggestions for improvement. To consider whether the experiment is something others would care to know about before we rip it apart.

I had a supervisor who showed me the ropes, but this shouldn’t be left to individual group leaders. We’re all in this together, both causing the damage and taking it. Instead of throwing young researchers into it head-first, maybe we can teach them, and make reviewing a more user-friendly experience.

“Hi, my name is Ana and I will be your reviewer tonight.”

Can’t Disrupt This: Elsevier and the 25.2 Billion Dollar A Year Academic Publishing Business

Author: Jason Schmitt
Original: Medium


Twenty years ago (December 18, 1995), Forbes predicted that academic publisher Elsevier’s relevancy and life in the digital age would be short lived. In an article entitled “The internet’s first victim,” journalist John Hayes highlighted the threat that the growing internet culture posed to the academic publisher’s profit margin: “Cost-cutting librarians and computer-literate professors are bypassing academic journals — bad news for Elsevier.” After publication of the article, investors seemed to heed Hayes’s rationale for Elsevier’s impending demise. Elsevier stock fell 7% in two days to $26 a share.

As the smoke settles twenty years later, one of the clear winners on this long timeline of innovation is the very firm that investors, journalists, and forecasters wrote off early as a casualty of digital evolution: Elsevier. Perhaps to the chagrin of many academics, the publisher has been neither bruised nor battered. In fact, its health is stronger than ever. As of 2015, the academic publishing market that Elsevier leads has an annual revenue of $25.2 billion. According to its 2013 financials, Elsevier had a higher profit margin than Apple, Inc.

Brian Nosek, a professor at the University of Virginia and director of the Center for Open Science, says, “Academic publishing is the perfect business model to make a lot of money. You have the producer and consumer as the same person: the researcher. And the researcher has no idea how much anything costs.” Nosek finds this whole system is designed to maximize the amount of profit. “I, as the researcher, produce the scholarship and I want it to have the biggest impact possible and so what I care about is the prestige of the journal and how many people read it. Once it is finally accepted, since it is so hard to get acceptances, I am so delighted that I will sign anything — send me a form and I will sign it. I have no idea I have signed over my copyright or what implications that has — nor do I care, because it has no impact on me. The reward is the publication.”

Nosek further explains why researchers remain so supportive by invoking the mantra of the dedicated, loyal customer base: “What do you mean libraries are canceling subscriptions to this? I need this. Are you trying to undermine my research?”

In addition to a steadfast dedication by researchers, the academic publishing market, in its own right, is streamlined, aggressive, and significantly capitalistic. The publishing market is also more diverse than just the face of Elsevier. Johan Rooryck, a professor at Universiteit Leiden, says, “Although Elsevier is the publisher that everybody likes to hate, if you look at Taylor & Francis, Wiley, or Springer they all have the same kind of practices.”

Heather Morrison, a professor in the School of Information Studies at the University of Ottawa, unpacks the business model behind academic publisher Springer and says, “If you look at who owns Springer, these are private equity firms, and they have changed owners about five times in the last decade. Springer was owned by the investment group Candover and Cinven who describe themselves as ‘Europe’s largest buy-out firm.’ These are companies who buy companies to decrease the cost and increase the profits and sell them again in two years. This is to whom we scholars are voluntarily handing our work. Are you going to trust them? This is not the public library of science. This is not your average author voluntarily contributing to the commons. These are people who are in business to make the most profit.”

Should consumers heed Morrison’s rationale and want to look deeper into academic publishers’ cost structures for themselves, they are met with a unique situation: the pricing lists for journals do not exist. “It’s because they negotiate individually with each institution and they often have non-disclosure agreements with those institutions so they can’t bargain with knowing what others paid,” says Martin Eve, founder of the Open Library of the Humanities.

In addition to a general lack of pricing indexes, the conversation around the value of a publication is further complicated by long-term career worth. David Sundahl, a senior research fellow at the Clayton Christensen Institute for Disruptive Innovation, says, “We actually understand how money passed through to artists who wrote music and authors who wrote books — but it is not clear how the value of a publication in a top tier journal will impact someone’s career. Unlike songs or books where the royalty structure is defined, writing a journal article is not clear and is dependent not on the people who consume the information but rather deans and tenure committees.”

Disruption Doable?

It is precisely this lack of a pricing and value barometer that leads to the complexities associated with disrupting the main players in academic publishing. “Adam Smith’s invisible hand works to lower prices and increase productivity, but it can only do so when valuation or pricing is known, and the same thing is true for disruption. If you don’t know how to value something, you actually don’t have tiers of a market,” says Sundahl.

If a disruptive force is to significantly change academic publishing, it will need to emerge in a market that the large-scale publishers currently underserve or find undesirable. “Disruptive innovation is usually driven by a group who can’t afford to build something that is as big, fancy and sophisticated as the existing solution — they then have to find a market where either people don’t have anything available to them or they are satisfied with something less than perfect,” says Sundahl.

Should academic scholarship continue along a trajectory similar to that of the past decades, Sundahl finds that incumbents (the existing big publishers) will almost always win, because competition is taking place along those sustaining-strategy lines. “To revolutionize academic publication, a new system would need to be developed in a basement market which would eventually enable people to gain enough credibility doing this new solution. People would then begin to value this lower end, well done research, and that is when the world starts to change,” says Sundahl.

That is exactly what large entities like the Bill and Melinda Gates Foundation, or perhaps even top-tier research (R1) universities, can’t do. “They have to play the game the way the winners are already playing it. Incumbents almost always win under those conditions,” says Sundahl. And to further complicate matters, junior colleges and community colleges, which might otherwise represent fertile ground for a newer, “basement market” entrant, may be less likely to spearhead this new outlet themselves due to increasing government constraints focused nearly exclusively on job placement and starting salaries in lieu of a research-based, theoretical curriculum.

Open Access Packs a Punch

Driven by this lopsided power structure, the move toward open access, and with it unrestricted access to academic information, has been growing rapidly. Perhaps it is, itself, a “basement market” for leveling the academic publication environment and creating a market where respect and credibility can be fostered, grown, and transitioned into the existing academic prestige, merit, and tenure conversations.

“The open access environment is one of the more fertile environments for people to be thinking: if we don’t like the old way, what should the new way look like,” says Heather Joseph, executive director at the Scholarly Publishing and Academic Resources Coalition (SPARC). Joseph finds that the numbers of open access journals speak for themselves and says, “You can look at the number of strictly open access journals if you look at the Directory of Open Access Journals (DOAJ). When it started tracking open access journals there were a few dozen and now they list over 10,000 open access journals.”

The push toward open access is not only growing in sheer numbers of journals but also in an increasingly confrontational strategy that academics leverage against large publishers. “At the moment, the Netherlands, the whole country, has said to Elsevier that we want all of our researchers to be able to publish open access in your journals at the same rates we would pay for a subscription last year and if you can’t do that we’re going to cancel every one of your journals, for all of our universities nationwide,” says Eve. “They have a few days left to resolve this, and it looks like they are going to cancel all the Elsevier journals.”

Rooryck found his recent, very public decision to step down and move his Elsevier journal Lingua to open access met with complete support from the other six editors and 31 editorial board members. “The process went very easily. We were all aware of the pricing and Elsevier’s practices and within a week everyone agreed to resign,” says Rooryck. Eve’s platform, the Open Library of Humanities, will now house the new open access iteration of Lingua, which will be called Glossa. Eve says, “Right away it is 50% cheaper to run it through us than when it was with Elsevier. So anybody subscribing to it already sees 50% more revenue.”

Rooryck finds the move toward broad open access a natural progression and says, “The knowledge we produce as academics and scientists should be publicly available in the same way we have a company that delivers water to our faucets and electricity to our home. These are things we have a right to. Public knowledge and education is a human right and it should not come with a profit tag of 35%.”

Although it appears open access can simultaneously diffuse academic knowledge to a larger body of readers and cut costs significantly, many feel that the for-profit academic publishers are still positioned to persist into the near future. Joseph says, “I think the play for most smart commercial publishers is to try to preserve the current environment for as long as they can: delay the policy changes, delay the culture changes and to be working on things like tools and services applying to aggregation of data, where they are then embedding themselves more deeply in the workflow of researchers and becoming essential to researchers in a different way.”

“If you are no longer essential to researchers in the ‘you have to publish in my journal in order to get tenure and promotion’ sense, what do they replace that with? I think the smart publishing companies like Elsevier, like Springer, who are very smart in that regard, have been thinking about where they can go to be playing a role of continuing to be seen as essential by the research community once they are no longer playing the role of providing assessment,” says Joseph.

Onward and Upward

“In the US Congress we have been finally making progress with the Fair Access to Science and Technology Research (FASTR) bill. It moved through the committee it was referred to in the Senate and is poised to move out of the Senate and potentially be considered by the House and hopefully pass. Ten years ago, I would have said we didn’t have a chance to do a stand-alone bill,” says Joseph.

Perhaps the recent congressional support Joseph refers to is one more sign that the majority of articles will be moving toward an open and accessible framework. Many in the academic community hope that this government support signals the reprioritization of the research framework and a changing of the guard. And while that is extremely important, others in the academic community are hoping to grow “basement markets” from the ground up.

The Center for Open Science, which provides seed funds to startups in the academic scientific research space, is led by Nosek and focuses on aligning scientific practices with scientific values. “The open science framework is just a means of connecting all the research services that researchers use across the entire research life cycle,” says Nosek.

Nosek is optimistic about the evolution of technology in open science and says, “There are a lot of startups going at different parts of the research life cycle. Whether it is publication and what a publication means, or looking at full articles and whether you can make articles convey information in smaller bite size pieces.” Nosek tells me that many solutions are being tried in research right now, and that it is hard to judge what the true solutions will look like. “I sometimes think some of the ideas haven’t a chance, but what do I know? I could be completely wrong about it. And that is the whole point — do some experimentation and try out ideas. And the fact is there are a lot of people who see what the problems are and have a unique sense of a potential solution — it is a very lively time to try out different answers.”

Time will tell if open access will be the disruption needed to allow the academic environment to right itself, or if a new market emerges from startup incubators like the Center for Open Science. Regardless of how the future vision is realized, most in the academic community hope that the next iteration of scholarly articles and publishing will do more good for humankind than for a hefty profit margin.


Academic Scattering

Author: Katie Mack
Original: Research Whisperer


A couple of years ago, I was gathering my things after a seminar at a top physics research institution when I overheard two of the senior professors discussing a candidate for a senior lectureship.

Professor A was asking Professor B if the candidate had a partner, which might make him less able to move internationally.

Prof B replied, happily: “No, he has no family. He’s perfect!”

I doubt any selection committee would admit on record to thinking a family-free candidate is “perfect”. Nonetheless, the traditional academic career structure is built around an assumption of mobility that is hard to maintain with any kind of relationships or dependents. I’m still trying to figure out if I can manage to keep a pet.

Right now I live in Australia, working as a postdoc in Melbourne. My first postdoc was in England. Before that I was in grad school in New Jersey, and I was an undergrad in my native California. Halfway through grad school I studied for a year in England. I’ve done two- or three-month stints in Japan, Germany, Australia and the UK. Each of these moves or visits has been, while not strictly required, extremely helpful for my career. And in a field where competition for jobs is so fierce, if you want any hope of landing that coveted permanent academic job, how many of these “helpful” moves can you really consider optional? If mobility is such an advantage, how does having a family or a partner affect your chances?

A couple of months ago, Slate published an article with the headline, “Rule Number One for Female Academics: Don’t Have a Baby.” The point of the article wasn’t actually to discourage women in academia from having children (though backlash from the community may have contributed to the change in title to the somewhat vague, “In the Ivory Tower, Men Only”). The article provided statistics and anecdotes to illustrate how having children, or being suspected of the intent to have children, could harm a woman’s progress in academia – from the necessary pause in research output, to the unconscious or explicit biases that act against “working mothers” but have no similar effect on “working fathers”. Personally, I found the piece deeply disheartening, but my dismay was of a somewhat detached variety. In order to worry about the effects of having children, one has to be in a position where that seems like even a remote possibility. As a single woman with a short-term contract and no idea which hemisphere I’ll be in two years from now, children are not exactly at the forefront of my mind. At the moment, I spend a lot more time thinking about the two-body problem.

In this context, the “two-body problem” is the problem of maintaining a committed relationship between two individuals who are trying to have careers in academia. When the two-body problem proves unsolvable, it’s sometimes called “academic scattering”. It is by no means unique to academia, but the international nature of the field, the frequency of short-term (1-3 year) contracts, and the low wages compared to other similarly intense career paths make it especially bad for academics. In the sciences, the gender disparity adds a further complication for female academics: when women make up a small percentage of the discipline, they are much more likely to be partnered with other academics.

Of course, solving the two-body problem is not impossible. I have many colleagues who have done it, either through spousal hires, fortuitous job opportunities, extended long-distance relationships, or various degrees of compromise. It takes sacrifice, luck, and, often, institutional support. But couples just beginning a relationship while building two academic careers might find the odds stacked against them. Even ignoring for a moment the fact that a no-compromise work-obsessed lifestyle is still considered a virtue in many institutions, academic careers are structurally best suited to people with no relationships or dependents, who travel light and have their passports at the ready.

It varies by field, but for physics and astronomy, a “typical” tenure-track career path looks something like this: 4-6 years in grad school, a postdoctoral fellowship for 1-3 years, then usually another (and maybe another), all followed by a tenure-track or permanent job, which may or may not be the job you end up in for the long-term. There’s no guarantee all these steps will be in the same country – very often they are not. For me, it’s been an international move every time so far, and it’s very possible the next one will be, too. When I took up my first postdoc, I left my country of origin, most of my worldly possessions, all my friends and family, and a committed relationship, to start all over in England. When I took up my second postdoc, I left my newly built life in England and another committed relationship to start all over yet again on the other side of the world. I’ve moved internationally several times chasing the prospect of permanent academic employment. I have yet to convince anyone to come with me.

I’m not trying to convince anyone that avoiding academia or refusing to move around the world is the key to solving all relationship problems. Anyone can be unlucky in love, even if they stay in the same city their entire lives. But academic shuffling is particularly hostile to romance. The short-term contracts mean that when you arrive in a new country, if you’re interested in finding a long-term partner, you have something like two years to identify and convince a person you’ve just met to agree to follow you wherever you might end up in the world, and you won’t be able to tell them where that will be. If you happen to have different citizenships (which is likely), you have to take into account immigration issues as well – your partner may not be able to follow you without a spousal visa, which can mean a rather hasty life-long commitment, or, depending on the marriage laws of the country in question, a total impossibility. I had a friend in grad school who, at the end of her PhD, faced a choice between living with her wife in Canada, and becoming a tenure-track professor at one of the most prestigious research universities in the USA.

The timing doesn’t help, either. The postdoc stage, when you’re doing your best impersonation of a human pinball, usually comes about in your late 20s or early 30s. It’s a time when it seems like all your non-academic friends are buying houses, getting married, having babies, and generally living what looks like a regular grown-up life. Meanwhile, chances are you’re residing in a single room in a short-term rental, wondering which country you’ll be living in next year. If you’re a woman, you might be keeping an eye on the latest research on fertility in older mothers, and mentally calculating how long you actually need to know someone before deciding to reproduce with them, because by the time you’re in one place long enough to think about settling down you’ll be, at best, pushing 40.

There are lots of ways to make it all work out, of course. You could refuse to date other academics, and instead make sure you’re spending enough time on hobbies outside of the university to attract someone’s interest, while making sure you have a REALLY good pitch about the joy of imminent mystery relocation. You could date another academic, and resign yourself to a relationship that will probably be long-distance for far longer than it was ever face-to-face, with no guaranteed reunion in sight. For this option, make sure that you have lots of disposable income for plane tickets and that neither of you is committed to spending too much time inside a lab. You could swear off serious dating altogether until you’re getting close to landing a permanent job, then negotiate with your future employer for a spousal hire, with the necessary career compromise that will be required of one or both of you to be at that particular institution.

Or you could just wait till you’ve scored a permanent faculty job somewhere, probably in your mid-to-late 30s, and (if you’re a woman) hope that you meet someone soon enough that starting a family is still an option. (As a side note, my late-thirties single straight female friends tell me that men who want babies won’t date women over 35. Obviously this is an unfair and unscientific generalization, but the point is that there are societal pressures that women face when they choose to put off the prospect of families until they have permanent jobs.) If you choose this option, you might also want to keep in mind that a tenure-track job isn’t necessarily permanent, and having a child before having tenure is one of those options that the aforementioned article had a few things to say about.

Or you could decide to prioritize where you want to be (or who you want to be with), and, more likely than not, end up severely limiting your career progress and/or leaving academia altogether. If one or the other partner does have to make a big career sacrifice, gender norms will suggest that, if you’re a woman, the one to make the sacrifice really ought to be you.

As for me, I confess I haven’t figured it out. I have two years left on my contract in Australia and no idea whatsoever which country I’ll end up in next. I’m applying broadly, and there’s no guarantee I’ll have a choice about location if I want to stay on the path toward becoming tenure-track faculty at a major research institution. When it’s not unusual for a single postdoc job to have 300 applicants, and faculty jobs are even more selective, getting even one offer is considered a huge win.

I don’t know if there’s a solution. Having a pool of early-career researchers who move frequently to different institutions unquestionably advances research and keeps the ideas flowing. It is also usually great for the development of postdocs’ research abilities, exposing them to new ideas and work styles. But the prospect of a nearly decade-long period of lifestyle limbo between graduate studies and the start of the tenure track is, understandably, a significant discouragement to many fine researchers who might otherwise bring their unique insights to the field. And, statistically, more of these lost researchers are likely to be women. It may not be the dominant force keeping women out of science or academia, and it may not affect all women, but any slight statistical skew that disadvantages women more than men contributes to the inequality we see. And that makes academia a little bit more lonely for everyone.


How I Lost my Voice: On Anonymity and Academic Blogging

Author: Psyc Girl
Original: Stressful Times


I haven’t been blogging much lately. And yes, these navel gazing posts about blogging and voice can be really irritating. For those of you who don’t want to read further, here is the TL;DR version:

I miss blogging, but I don’t know who my voice is anymore, and I miss true pseudonymity. I’m not sure what to do about this. 

A big part of the slowdown is the fact that several people I know IRL read my blog now – that means when I write I picture them sitting at home in their track pants reading my inner thoughts and feelings. Things that I might or might not want to share with actual people I interact with face to face on a regular basis. It’s like being tapped into my frontal lobe and I’m not sure how I feel about that all of the time.

There are also readers who know who I am from the blog – they don’t necessarily interact with me face to face on a regular basis, but they could follow the rest of my life online if they wished. And I follow their lives. None of this really bothers me (I don’t have a lot of choice in the matter even if it did, let’s be honest), but it does censor my writing, compared to 5 or 6 years ago when I was literally sending pseudonymous words completely into the unidentified blogosphere.

What if someone who reads the blog outs me at work by accident? What if someone decides they don’t like me anymore and they out me on purpose? What if that angry post from 3 years ago comes back to haunt me because someone is upset with me? I can’t help but think these things.

There is also the delicate balance between blogging the details of my “real life” vs. my pseudonymous, general writing. For example, early on I made the decision to never blog about my actual research area. Thus, Agricultural Psychology was born – I focus on cow studies, corn growth, and chick studies. These give me helpful labels with which to communicate the overall process of being a researcher – but it limits the exciting things about my research I can really discuss.

I also try my best not to blog about something I’ve said in person to strangers who could accidentally stumble across the blog (and whom I wouldn’t want to). If I make a joke on Twitter, I don’t make the same joke in class. If I tell a story to a colleague, I don’t tell it on the blog. And vice versa. Sometimes this level of self-censorship gets difficult – but as I develop a more offline social network of professional support, I find myself preferring to turn to them in person vs. the blogosphere, if I have to choose.

Third, there are things I want to blog about that are very important academic causes to me – but doing so would make me more identifiable.

These last two points both relate to the question of whether pseudonymity remains important to me. It gives me a great deal of freedom in my writing. And as much as people say academic blogging is dying, I disagree – I know of 2-3 academics who, in the past year, switched from identifiable to pseudonymous blogging so they could have more freedom in their voice. And I know identifiable bloggers who will not touch many topics on their blog because they cannot (for whatever reason) attach their name to their opinions.

What do I want? I don’t know. I could remain the same. I could lock down any additional disclosures of my identity. I could be more loose with my pseudonymity and not care if I’m identifiable. I could blog under my name. I have been wrestling with all of these options since I became a faculty member – the only thing I seem decided upon is not blogging with my real name displayed. There are many topics I no longer write about – teaching, graduate student mentorship, my medical concerns, my love life – because I no longer have complete pseudonymity.

Fourth, I realized about 12-18 months ago that I was spending more time online than I wished. I wanted my focus to turn to my offline life. It was hard to really take a look at how much time I was spending attached to a screen and to let go of some of the online interpersonal connections that were very important to me. Twitter, in particular, has a short memory – after a year of less activity there, including deleting the app from my phone, I have far fewer interactions online. I miss them. But my offline life is a lot more consistent with the way I want it to look. I’m sad to have made one sacrifice for the benefits of the other. Less time online = less blogging.

Last, my career has changed. I still have issues and experiences related to academia that I think are helpful to others – I have no intention to quit blogging altogether – but I’m spending a lot more time crafting my career lately. I’m removing things, adding things, and reconstructing things. I suppose one could argue that I’m welcome to blog about those experiences. I’m finding, instead, I need to do a lot of these changes alone. I’m not discussing them with others, but instead quietly reflecting and tweaking solo.

To summarize: I don’t quite know where my blog is going. I know that I want to be blogging more. I think as my career changes, the things I have to share change. But my personal life will probably stay in the closet, at least until I figure out which voice to wear.

What Parents Need to Know About College Faculty

Author: Joseph Fruscione
Original: PBS NewsHour


It was a nice spring day in 1999 — my second semester of teaching. I was walking past a campus tour group and saw one of my students leading it. The timing couldn’t have been more perfect: as I was passing them, a parent asked if all university faculty were full time. “Yes,” my student said. I was taken aback, because I’d told my classes about being adjunct, as well as a bit about what “adjunct” meant and how many of us there were in the English department alone teaching freshman writing.

The next day, I pulled him aside after class and asked him about it. “I’m not mad at you; I’m just curious: Your class knows I’m a graduate student, not a full-time professor with tenure. I don’t even have my doctorate yet. Why did you tell that parent all university faculty were full time?”

“That’s what the university wants us to say to parents,” he replied.

This is one of many moments in my career I’d like to revisit with the knowledge and dedication to activism I have now. Granted, things have improved a bit since then: another former student told me in 2013 that tour guides tell parents the school employs a variety of professors, and that some of them teach at other schools. This is slightly better, but still not ideal at a school whose tuition is among the highest in the country — yet whose senior administrators receive CEO-level compensation.

I’d love to visit as many colleges as possible during spring tours and back-to-school move-in time. Fifteen years of adjuncting — six as a Ph.D. student, nine as a scholar and hopeful jobseeker — has given me a lot of rich, sometimes troubling experiences that I’d want to share with parents and students. I wish, for example, a parent had been with me that time I was in the elevator with my department chair, who quasi-complained about having to teach one class in a semester when I had four first-year writing courses across two schools. (To be clear: I’m not questioning how much work and how many demands chairs have. I’m questioning the myopia of this person’s comment.) “I guess I shouldn’t complain about this to you,” the chair said. “Yes, but it’s fine,” I said. “I manage, and I’m gaining good experience.”

Had some parents been with me, I could’ve added this: “Maybe you can explain to them why the university thinks it’s good to give students — especially freshmen — a string of part-time professors who may be teaching at other schools to make ends meet. Can you or one of the provosts meet with their children while I’m teaching somewhere else to approach a livable wage?” At the time, I was playing nice because I’d hoped (naively) that I could move up the ranks in the department to a full-time position. Perhaps I should’ve damned the torpedoes and just spoken my mind. Playing nice rarely helps adjuncts move up at any school.

We are all a part of higher education’s culture of contingency, whether we’re students, parents, staff members, graduate TAs, administrators, professors, former academics, and so on. The precarious working conditions on American college campuses mean that adjunct and other non–tenure track faculty must often weigh their desire to teach against the financial realities of what is, fundamentally, full-time part-timing. In such cases, students suffer when their adjunct professors have to curtail office hours, spend more time traveling between campuses than preparing lectures, grade and comment on writing assignments when they have 70-80 (or more) additional students across several campuses, and otherwise splinter their time and attention.

If you’ll indulge me, parents, I’d like to assign some tasks to any of you both interested in learning more about your children’s schools and willing to help change American higher education. (After 15 years of teaching, I apparently can’t resist assigning homework.) Ultimately, you have every right to know exactly where your tuition dollars are going, how university administrations and policies are harming your children’s learning conditions, and how your children’s teachers are not always treated professionally and equitably.

Want to be more active and engaged in helping improve your children’s college experiences? Think your voice needs to be heard? Here are some simple yet effective ways to get started:

  • Help my fellow advocates and me petition David Weil, the administrator of the Wage and Hour Division at the U.S. Department of Labor, to investigate higher education. (I wrote more about our petition in this previous Making Sen$e post.) We hope to reach 10,000 signatures by Labor Day.
  • Read (and then share) these recent pieces by John Warner and Mary Grace Gainer about what you should know and what you can do.
  • Encourage your children to know more about the contingency of their adjunct professors, as well as how that status affects their learning environment. Remind them that financial necessity leads many of their professors to teach at other schools, and perhaps do extra tutoring and editing on the side to make ends meet.
  • Ask your children to contact the university newspaper about writing stories or op-eds about their adjunct professors. (If they need a model, have them read this nice piece from a freshman at my former school.)
  • When your school is trumpeting its new facilities (but perhaps not telling you about those for its leaders) and its newly hired star professors (who probably won’t teach undergrads), and presenting campus “business” as that of a luxury cruise (huh?), ask instead about the working conditions and job stability of its non-tenure track faculty — i.e., the likely majority of professors your children will have. (You’ll get some sample questions in a second.)
  • Follow the advice from other professors (see below) about what you need to know, what you can do, and how you can do it. Learn more about these and other writer-activists dedicated to improving American higher education. Share their work with other concerned, tuition-paying parents whose children might be facing record levels of student debt after they graduate.

The next time you’re on campus, you can ask someone in charge — dean, provost, admissions director, and so on — questions like these:

  • What percentage of your faculty are adjuncts? Approximately how many of your faculty have to teach at other schools?
  • How much do you pay adjuncts per course? How do adjuncts’ salaries compare to those of full-time tenured or tenure-track faculty?
  • How many, if any, tenured professors teach first-year students?
  • What are the salaries of the school’s upper-level administrators, and how many (if any) courses will they teach this year?
  • How is there funding to install posh new facilities or pay star professors who probably won’t teach my freshman, yet not enough to pay the majority of our children’s professors a living wage or give them meaningful full-time positions?

You can also ask campus tour guides these questions, but remember most of them are undergraduate students making a little extra money; they’re not the ones remaking higher education in a corporate, almost anti-intellectual image. If anything, they’re victims of the new college campus, not the creators of it.

Even if you’re not visiting campus, you can call or email the school. You might get the truth. You might not. You might get some spin or adminspeak about “valuing all faculty equally,” “financial realities,” and “some faculty teaching at multiple schools.” If you contact your schools, take notes about whom you’re speaking with, what he or she says, and so on. Then, let me know what happens, so I can write a follow-up piece. When more of you start asking these kinds of questions, university administrations will realize that their actions to undermine higher education are not going unnoticed.

I’m far from the only person eager to talk with you. I asked my professional network what they’d most like to say to parents. I got some smart, wonderful responses:

Natalie Dorfeld: I’d ask them how they would feel if they knew some of their children’s professors were on food stamps.

Brianne Bolin: I’d tell them that at my school, 78 percent of classes are taught by adjuncts who get 8 percent of the extortionist tuition that they’re shelling out. I’d also ask them if they were more concerned with an education or a piece of paper.

Debra Leigh Scott: Don’t blindly send your children to college without informing yourself about the corporatized university of 2014. Don’t apply to colleges without getting the real numbers of adjuncts who will be teaching your children. Know that the universities lie about this. Look at the adjunct-run lists, and get the numbers and the details from somewhere other than the universities themselves.

Desirée Sunshine: Don’t go into debt. If you can’t pay as you go, it’s not worth it.

Gordon Haber: Parents should lock their kids in the basement rather than let them attend for-profit colleges.

Miranda Merklein: Do not send your kids to schools with a pattern of low-wage contract labor, budget cuts to faculty (reduction in costs to instruction), and tuition increases. That pattern demonstrates a lack of concern for education.

Amy Lynch-Biniek: Ask about labor conditions; insist that working conditions equal learning conditions.

Seth Kahn: Make the effort to understand contingency. Know the differences among different kinds of academic jobs. Senior administrators (president, provosts, chancellors and deans) and faculty are very different; there are ranks of faculty even within tenure track, and those titles mean some concrete things.

Melissa Bruninga-Matteau: Absolutely know the ratio of tenure-track faculty to adjuncts, and ask what percentage of classes are taught by non–tenure track faculty, including if there are any grad students teaching classes.

Amy Leggette: Discuss the purpose and expectations of higher ed: i.e., is it job preparation? An “experience”? Or something else?


Professor Never had even more to say: As a parent of a rising high school senior, I have found touring colleges with my son a sort of revolting experience. I wasn’t the rebel I’d planned to be on the tours as I found the all-smiles-come-to-our-college/resort atmosphere sickening and oddly oppressive. While I was disappointed in myself for not doing a better job of educating the other parents on the tours with me, I take every opportunity I can to educate all the other parents I know who have kids of similar age about how universities are spending their money. Parents need to know they are getting state-of-the-art stair machines instead of well-compensated professors. They need to know they’re getting luxury dorms instead of professors who have office space and health care. They need to know most universities care more for attracting students than they do about educating them.

Touring colleges should be a lot of things, but “a revolting experience” should never be one of them. (Ever.) These pieces of advice are the proverbial tip of the iceberg. Remember, this comes only from people in my social network. Surely, the tens of thousands of non–tenure track faculty across the country would have more to say. The more you encourage your children and fellow parents to follow your lead in asking tough questions, the more American higher education can change.

Parents: college students and faculty need you on their side if higher education is going to change. Know where your tuition dollars are going. Know more about how the “budget shortfalls” at your children’s schools affect their learning but not senior administrators’ bloated salaries. Know also how many of these administrators are making efforts to further erode professors’ job stability and academic freedom through restrictive social media policies, rejections of tenure cases, and controversial decisions to rescind job offers. Ask the questions your children’s schools may not want to hear.

You can help, parents. A lot. College semesters are starting up again. You might be on campus to help your son or daughter move in, or you might be figuring out how tuition payments are affecting your yearly budget. Either way, you can help fix higher education by following the above advice, asking questions, and otherwise taking an active role in understanding the truth about your children’s education. Students and faculty are on your side; they want you on theirs.

On Student-Shaming and Punching Down

Author: Kevin Gannon
Original: The Tattooed Professor


A few years ago, trapped in the midst of final exam grading, I started posting some of the real howlers I got as answers on Facebook. I didn’t use students’ names, and I don’t “friend” students on FB, so this sort of venting seemed like an OK way for me to keep my sense of humor during the end-term crush.

I have felt guilty about doing that ever since.

Now, I vent plenty on Facebook and (especially) Twitter. PLENTY. But I deploy my snark laterally, or upwards–not down. Not any more. If I am the advocate for teaching and learning that I say I am, then I need to walk the walk. If I argue that failure is not a defeat, but something on which to build successes, then how can I use others’ failures as fodder for cheap laughs?

When I was doing my Ph.D. work, our department had a graduate lounge for our exclusive use, and I used it plenty. Frequently, a certain one of my fellow Ph.D. students would come into the lounge after leading a discussion section and, without fail, just go full blast on his students. THEY DON’T KNOW ANYTHING! THEY CAN’T WRITE! THEY DON’T UNDERSTAND HISTORY! And then he’d get personal. “Student X is a slack-jawed yokel,” that type of stuff. And I would think: Dude, if you’re that cynical now (we were both in our mid- to late-twenties), then I want no part of you when you’re forty.

Facebook and Twitter didn’t exist then; hell, the internet was still fairly novel. But I imagine that guy, and others like him, probably LOVE the “Dear Student” series done by the Chronicle of Higher Education on its Vitae site (which is geared toward job-seekers and grad-school, early-career academics). And, to be sure, some of the behaviors in these columns’ sights might look like easy targets–just like the laugh lines in those student final exams I decided to publicly make fun of back in the day. However, it’s one thing to vent by trading stories and frustrations among trusted friends and colleagues. It’s another thing altogether to vent to vast swaths of the internet. And when it goes beyond venting, there’s a real problem. The “Dear Student” columns are mean. They punch down. They inflate the pedantic into the problematic, and then humiliate rather than empathize. And I’m certainly not the only one who has this reaction; yesterday, Jesse Stommel wrote a magnificent and eloquent essay on why “Dear Student” is such an awful idea. The entire piece is a must-read, but his point about the climate this type of student-shaming work creates is worth repeating:

Everyone that comes into even casual contact with Vitae’s “Dear Student” series is immediately tarnished by the same kind of anti-intellectual, uncompassionate, illogical nonsense currently threatening to take down the higher education system in the state of Wisconsin…Giggling at the water cooler about students is one abhorrent thing. Publishing that derisive giggling as “work” in a venue read by tens of thousands is quite another. Of course, teachers need a safe place to vent. We all do. That safe place is not shared faculty offices, not the teacher’s lounge, not the library, not a local (public) watering hole. And it is certainly not on the pages of the Chronicle of Higher Education, especially in Vitae, the publication devoted to job seekers, including current students and future teachers.

He’s absolutely right. As someone who has been particularly concerned with the (mis)uses of power in academic settings, I found that Stommel’s admonition hit home. He put into words much better than I could have why I still feel guilty about my previous Facebook venting.

Again, this doesn’t mean the end of snark and sarcasm. But punch up, not down. Powerful tenured professor berating students or misusing his power to make life tough for female, LGBT, or African American faculty? There will be richly deserved snark. Political leader who adopts a belligerently ignorant stance to justify depriving others of basic rights? You will be roasted on Twitter, and I will applaud and retweet. But calling out students–giving examples of their mistakes or missteps? No. As educators, we are the ones with the power. Student foibles are temporary. Our reactions to those foibles can be permanent–for both us and them.

Consider the following Twitter feed:
[A series of embedded tweets appears here in the original post.]

 

All of these, actually, represent some of the “highlights” of my own undergraduate career. If my professors had been on Facebook or Twitter and thrown these out on the internet (and it’s not like any of this crap I did was in private), what would have happened if I had seen or heard about this “venting”?

Would I have gotten it together and kicked ass in my (second) Senior Year?

Would I have believed what some of my professors told me, that I should try for graduate school?

Would I have gotten in to a Master’s program, then completed it, then gotten into a Ph.D. program with a fellowship?

Would I have asked for the help I needed to address my increasingly deteriorating “lifestyle choices?”

Would I have been lucky enough to be in a position like I am now, where I can teach teachers and students? And in doing so, experience daily growth myself?

I doubt it.

I don’t like shame. I run and hide from what makes me ashamed, and do my level best to stay hidden.

I don’t know if my professors joked about me at the coffee pot, or traded stories about me at cocktail parties. But I do know that they took an interest in helping a student who was trying to get his act together. I do know that they helped build academic confidence for a student who may not have always been receptive to that help. I do know that they offered advice, perspective, and support–as well as references, recommendations, and cheerleading–to a student who wanted to pursue their field of study at the graduate level. I do know that they did this even at the times when I didn’t look or act as grateful as I truly was.

The simple truth is that I am where I am today–in all senses of the term–in part because others did not shame me for the things about which I was already ashamed. I was the “Dear Student” who the Vitae series has dead in its sights. What might we lose tomorrow as a result of shaming today? What do we do to ourselves, our colleagues (present and future), and our students if we revel in punching down at folks who may not even know they’re targets? What–WHO–gets damaged?

We all do.

So, I humbly offer a revised column:

Dear Student:

You’ll get better at this. So will we.

Faculty (a.k.a. former students)


 

You do not Need to Work 80 Hours a Week to Succeed in Academia

Author:
Original: Dynamic Ecology


There is a persistent myth (some might even call it a zombie idea) that getting tenure in academia requires working 80 hours a week. There’s even a joke along the lines of “The great thing about academia is the flexibility. You can work whatever 80 hours a week you want!” The idea that you need to work 80 hours a week in order to publish or get grants or tenure is simply wrong. Moreover, I think it’s damaging: I hear routinely from younger folks (often women) who are seriously considering leaving academia primarily because they think that a tenure track position will require working so much that they wouldn’t be able to have any life outside work (including raising a family)*. So, this is my attempt at slaying the zombie idea that succeeding in academia requires working as much as an investment banker**.

This post was inspired by this comment from dinoverm on last Friday’s linkfest post, where I linked to the “7 Year Postdoc” article, even though I had already linked to it earlier, because I found that it kept coming up in conversations with grad students, postdocs, and new faculty. In linking to it on Friday, I said, “I really like the idea of deciding what you are okay with doing (maybe you aren’t willing to move anywhere in the country/world, or you really want to do a particular type of research but aren’t sure how “tenurable” that line of work will be), and then using that to set boundaries on what you do as a faculty member. I think this perspective is really valuable for people who are considering stepping off the tenure track primarily because they’re worried about work-life balance or quality of life. Obviously getting tenure will require working hard, but the lore that it requires 80 hour work weeks and ignoring one’s non-work priorities is simply wrong, and I think this perspective is a good one for thinking about how to balance things.” That led to discussion in the comments on how it is rare for someone to “admit” to not working 80 hours a week. This is something that we’ve discussed in the comments before. (Thanks to Jeremy for figuring out where!) You should go read this entire comment from Brian, because it’s great. (The rest of that comment thread is worth reading, too. There are lots of good thoughts there about parenting and academia, in particular.) But, just to quote part of it here:

“I think it is time to start calling BS on such posturing. Nobody works 80 hours a week regularly (as she claimed in one post). It actually is physically impossible* over the long run. I used to be a consultant where you billed every hour. We were a bunch of type As in an environment where we were strongly encouraged to work long hours (indeed it’s how the company made money by paying us a fixed salary and billing hours worked). I think I exceeded 80 hours once in 9 years, and only rarely and only in times of crisis exceeded 60. The official company expectation was 45 (although of course if you wanted a good review you might aim to be a tad above rather than below). We don’t record hours in academia, but I know what 80 looks like and I know what 60 and 50 and 40 look like because I measured it so carefully for 450 weeks and I haven’t seen anything truly different here. Most young profs are in the 40-60 hour range is my belief with most in the lower half of that. And yes 50 hours plus rest of life feels crazy and insane. But stop saying it’s 80 and making everybody else feel guilty they’re not measuring up. The game is incented to exaggerate how much you work, so believe those numbers other people throw out at your risk.”

<cutting lots of great thoughts that you really should go read>

“*Do the math on working 80 hours/week (112 waking hours – 14 hours/week eating/grooming/maintaining car and house – 5 hours commuting = 93 hours, and that is pretty sparse grooming and maintaining – e.g. no exercise – and nobody lives on 13 hours/week of leisure time)”
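To make that footnote’s arithmetic concrete, here is a minimal sketch in Python; the only inputs are the assumptions quoted above (roughly 16 waking hours a day, 14 hours a week of eating/grooming/household upkeep, and 5 hours of commuting):

```python
# Minimal sketch of the footnote's arithmetic. All figures are the
# assumptions quoted above, not measurements of anyone's actual week.
waking_hours_per_week = 16 * 7          # 112, assuming ~8 hours of sleep a night
eating_grooming_household = 14          # hours per week
commuting = 5                           # hours per week

available = waking_hours_per_week - eating_grooming_household - commuting
leisure_on_80_hour_week = available - 80

print(f"Hours available for work each week: {available}")                 # 93
print(f"Leisure left over on an 80-hour week: {leisure_on_80_hour_week}") # 13
```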

Why does this myth persist? Probably it’s in part because, if you think everyone else is working 80 hours a week, it can seem risky to admit that you aren’t, since that could make you seem like a slacker.

But I think another important reason for the persistence of this myth is that people are bad at recognizing how much they actually work. Unlike Brian, most of us haven’t spent years tracking our exact hours worked, and so don’t have a realistic sense of what an 80 hour work week would really feel like. As a grad student and postdoc, I thought I worked really hard. But then I made myself start logging hours (sort of like I was keeping track of billable hours, though I was simply doing it out of curiosity). I was astonished at how little I actually worked. It was something like 6 hours of actual work a day. I never would have guessed it was that low. I hadn’t realized how much time I was spending on those seemingly little breaks between projects. I used to count a sample, then go read an article on Slate, then go count another sample, then go read another article, etc. At the end of the day, if you’d asked what I’d done, I would have said I’d spent all day counting samples. But, in reality, I had probably only spent roughly half my day actually counting samples. I found this exercise really valuable and eye-opening. I think it probably did more to make me more efficient in how I work than anything else. And working efficiently frees up lots of time for other things (including spending time with my kids). I’ve recommended this to people who were struggling to keep up with tasks they needed to accomplish, and also have recommended keeping track of basic categories (maybe research, teaching, and service) when doing this accounting to see if the relative time devoted to those tasks seems reasonable.
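For anyone who wants to try the same exercise, here is a minimal sketch of that kind of accounting, assuming you simply jot down each work session with a category and a duration; the categories and hours below are invented purely for illustration:

```python
from collections import defaultdict

# Hypothetical log of one week's work sessions: (category, hours).
sessions = [
    ("research", 3.5), ("teaching", 2.0), ("research", 1.5),
    ("service", 1.0), ("teaching", 4.0), ("research", 2.5),
]

totals = defaultdict(float)
for category, hours in sessions:
    totals[category] += hours

week_total = sum(totals.values())
for category, hours in sorted(totals.items()):
    print(f"{category:10s} {hours:5.1f} h  ({hours / week_total:.0%} of logged time)")
print(f"{'total':10s} {week_total:5.1f} h")
```

Even a rough log like this makes it obvious how the hours you think you worked compare with the hours you actually logged.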

So how much do I work? That has varied over the years, not surprisingly. When I started my first faculty position, there were times when I felt like I was working as hard as I possibly could, and I started to wonder if I was working 80 hours a week. So, I tallied the hours. It was about 60 hours/week. And that was during a really time-intensive experiment, and was a relatively short-term thing. (I’m not sure, but that might be similar to the amount I worked during the peak parts of field season in grad school.) I could not have maintained that schedule over several months without burning out, regardless of whether or not I had kids. Right now, I’d say I typically work 40-50 hours a week. I am in my office from 9-5, and I work as hard as I can during that time. I usually can get some work done after the kids go to bed, but there’s also prepping bottles to send to daycare the next day, doing dishes, etc., so I definitely have less evening work time than I used to. And I usually get a few hours total on the weekend to work, but that’s variable.

Again, I think the key is being efficient. This article has an interesting summary of the history and research behind the 40 hour work week. It argues (with studies to back up the argument) that, after an 8 hour work day, people are pretty ineffective:

“What these studies showed, over and over, was that industrial workers have eight good, reliable hours a day in them. On average, you get no more widgets out of a 10-hour day than you do out of an eight-hour day. Likewise, the overall output for the work week will be exactly the same at the end of six days as it would be after five days. So paying hourly workers to stick around once they’ve put in their weekly 40 is basically nothing more than a stupid and abusive way to burn up profits. Let ‘em go home, rest up and come back on Monday. It’s better for everybody.”

That article points out that there is an exception – occasionally, you can increase productivity (though not by 50%) by going up to a 60 hour work week. But this only works in the short term. This matches what I’ve found in my own work (see previous paragraph) and also seems to match the quote from Brian above.

So, please, do not think that you need to work 80 hours a week in academia. If you are working that many hours, you are probably not being efficient. (I’m sure there are exceptional individuals who can work that long and still be efficient, but they are surely not the norm.) So, work hard for 40-50 hours a week (maybe 60 during exceptional times), and then use the rest of the time for whatever you like***. And, please, please, please, stop perpetuating the myth that academics need to work 80 hours a week.

* People who are regular readers of this blog will know that I don’t think there’s anything wrong with non-academic careers. I simply want people to make their decisions based on accurate information, and don’t want someone choosing to step off the tenure track primarily because of the myth that it requires 80 hour work weeks.

** As it turns out, investment bankers are being encouraged to work less, though “less” is still a whole lot by most standards. (Here’s another story on the same topic.)

*** I encourage exercise as one way to use some of that time. (Perhaps that’s not a surprise, given that I have a treadmill desk.) In talking with other academics, it seems that exercise is often one of the first things to go when things get busy. I enjoyed this post by Dr. Isis, which explains why she decided to start prioritizing exercise again. (The comments on that post are good, too.) When I made myself mentally switch from saying “I don’t have time to exercise” to “I am choosing not to prioritize exercise”, I suddenly got much better at working exercise into my schedule.


Me and My Shadow CV

Author:
Original: Chronicle of Higher Education


This fall I’m serving as the designated coach for doctoral students in my department who are on the academic job market. They’re a talented group, with impressive skills, hopes, and dreams. I’m grateful to be guiding them, as they put their best selves before search committees. However, one part of the work is not all that pleasant: I also need to ready them to face mass rejection.

Regardless of any happy outcomes that may await, they’re about to endure what may be their first experience of large-scale professional rebuff. Before, during, and after college, they sought part-time and full-time jobs and applied to graduate schools. They didn’t get hired, or they didn’t get into some of those schools, naturally. But now they’re putting themselves in line for 40, 50, or more rejections within the space of weeks and months — on the heels of a grueling, humbling few years of dissertation writing.

I feel their pain, to some extent. Those of us on the job market a decade or more ago got our mass rejections in thin envelopes or via email in May or June, after we’d had a few closer looks and maybe even a job offer. Today’s candidates learn they’re out of the running for coveted jobs much sooner, and secondhand, by confronting another candidate’s report of an interview or an offer on the Academic Job Wiki.

That then-and-now difference got me thinking about how we teach graduate students to face academic rejection. Of course, we largely don’t. Rejection is something you’re supposed to learn by experience, and then keep entirely quiet about. Among academics, the scientists seem to handle rejection best: They list on their CVs the grants they applied for but didn’t get — as if to say, “Hey, give me credit for sticking my neck out on this unfunded proposal. You better bet I’ll try again.” Humanists — my people — hide our rejections from our CVs as skillfully as we can. Entirely, if possible.

That’s a shame. It’s important for senior scholars to communicate to those just starting out that even successful professors face considerable rejection. The sheer scope of it over the course of a career may be stunning to a newcomer. I began to think of my history of rejection as my shadow CV — the one I’d have if I’d recorded the highs and lows of my professional life, rather than its highs alone.

More of us should make public our shadow CVs. In the spirit of sharing, I include mine here in its rough outline, using my best guesses, not mathematical formulas. (I didn’t actually keep a shadow CV, despite predictable jokes I may have made in the past about wallpapering my bathroom with rejection letters.)

  • What my CV says: I have published many articles in refereed journals. What my shadow CV would say: Multiply that 3x to get the approximate number of rejections I’ve received. Earlier in my career, it was more like 4x; now it’s closer to 2x. That does not count “revise and resubmit” letters. Fortunately, the rejections do seem to get nicer, as I learn better how to present work for publication and to select journals that are a good fit for my work. I also receive more invitations to contribute, providing better odds for acceptance.
  • What my CV says: I have published books at a great university press. What my shadow CV would say: My first book was rejected six times at the proposal stage before it found a home. One of those rejections came with a report so nasty it made me question my will to write another sentence.
  • What my CV says: I’ve edited several collections of essays. What my shadow CV would say: One collection was rejected 12 times at the proposal stage. Another collection almost imploded due to conflict among contributors. A savvy press editor smoothed the ruffled feathers. That’s not all. I co-wrote a book that was under contract but was canceled by the university press’s marketing department. That book never saw the light of day. And another co-edited book, commissioned by a professional organization and some distance along, was canceled by the press and then by the organization.
  • What my CV says: I’ve received some grants and fellowships. What my shadow CV would say: Multiply that total 5x to get the number of grant rejections I’ve received — with, again, the most depressing rates of rejection coming earliest in my career. Early on, I would apply for four to eight grants or fellowships, and receive none or one. I applied for one grant eight times before receiving it. I like to think the organization finally awarded it because they were tired of hearing from me, but maybe my application actually improved.
  • What my CV says: I’ve taught at five fabulous institutions. What my shadow CV would say: This one is the worst. In the process of trying to solve a two-body problem, I was on the job market a lot. I think I’ve been rejected for nearly 400 college teaching jobs and postdoctoral fellowships. In other words, I got offered less than 2 percent of the jobs I applied for, and I’m by no means among the hard-luck cases.
  • What my CV says: I have won elections to office in my professional organization. What my shadow CV would say: I have lost about half as many elections as I’ve won. I’ll take those odds!
  • What my CV says: I have some great recommenders. What my shadow CV would say: They are great. I’ve cried in front of a few of them. Academic life has been stressful. (Also, thank you for those hundreds of recommendation letters. They made everything possible.)
  • What my CV says: I have had some great students. What my shadow CV would say: They are great. A few have cried in front of me. Academic life is still stressful. (And you’re welcome for those hundreds of recommendation letters. I may still owe more to the universe than I have given.)
  • What my CV says: I have published in and been quoted in popular media. What my shadow CV would say: You can’t really count the number of times that The New York Times didn’t call you for a quote, so no formula there.

I made many failed attempts at getting my work in print, while learning how to write for new audiences and building relationships with editors. Let’s call this rejection factor 4x, on average, although many of those rejections were not of pieces that eventually saw print but of pieces that never did.

In total, these estimates suggest I’ve received in the ballpark of 1,000 rejections over two decades. That’s 50 a year, or about one a week. People in sales or creative writing may scoff at those numbers, but most of my rejections came in the first 10 years of my academic career, when I was searching intensely for a tenure-track job. Very few came during the summer, when academic-response rates slow to a crawl. I remember months when every envelope and every other email seemed to hold a blow to the ego. My experience was not unusual. Unfortunately, a multiyear job search is, if anything, more common now for would-be academics than when I was on the market.

Most of us get better at handling rejection, although personally, it can still knock the wind out of me. Usually in those moments, I recall something a graduate-school professor once said after I railed at, and — much to my embarrassment — shed a few tears over a difficult rejection: “Go ahead,” he said. “Let it make you angry. Then use your anger to make yourself work harder.”

It sounds so simple. Whether any single rejection is fair or unfair doesn’t ultimately matter. What matters is what you do next. You could let rejection crush you. Or you could let it motivate you to respond in creative, harder-working, smarter-working ways. (I’m convinced, though, that rejection is particularly tough to take in academe because so much of our work is mind work, closely tied to our own identities and sense of self-worth.)

A CV is a life story in which just the good things are recorded, yet sometimes I look at it and see there what others cannot: the places I haven’t been, the journals where my work wasn’t accepted, the times a project wasn’t funded, the ways my ideas were judged inadequate. I’ve started to imagine my CV as a record of both highlight-reel wins and between-the-lines losses. If you’re lucky, you will, like me, also one day come to recognize the places where the losses — as painful as they were at the time — led to unexpectedly positive things. Slammed doors, it turns out, may later become opened ones.

When I was meeting with my department’s academic-job seekers recently, one of them asked me about the last time I was rejected.

“My last rejection was one week ago,” I admitted to them, feeling uncomfortably like someone introducing myself at an AA meeting. “I got two rejections, in fact. One was really, really hard to accept, and, I think, wrong. But I’ll take it for what it’s worth and try again.”

Increasingly, I see rejection as a necessary part of every stage of an academic career. I remind myself that the fact that I’m still facing rejection is evidence that I’m still in the game at a level where I should be playing. I’m continuing to hone my skills and strive for better opportunities — continuing to build both my CV and my shadow CV. Each version is necessary as we seek to advance our research, teaching, and service, the activities to which some of us — and I wish there were many more of us — have the good fortune to devote our professional lives.

On Critical Abyss-Gazing: Depression & Academic Philosophy

Author: Jake Jackson
Original: PhDisabled


Content note: This post involves frank discussion of the experience of depression and includes reference to the recent suicide of Robin Williams.


A few months ago, the night before a conference in which I was participating, I let slip to the Chair of a philosophy department that I often have trouble sleeping. He asked why.

Realizing I may have revealed more than was perhaps savory to someone I had just met, I stammered: “Why, I’m an existentialist!”

The catchphrase fit. After all, the next day I was presenting a paper that dealt with Kierkegaard and Nietzsche on (un)certainty and faith. He then laughed, made a joke of it himself, but gave a knowing-yet-compassionate look.

I was safe. Even in the form of a joke, this was perhaps one of only two instances where I have openly implied the presence of my lifelong depression to a tenured faculty member in my field without regretting it or worrying about how it might affect their perception of me.

This post seeks to question the way that academic philosophy perceives depression. I am not writing this with statistics or numbers, but instead from the subjective phenomenological perspective of someone who has depression and who works in – and aspires to build a career in – academic philosophy.

I seek not to grind an axe against any particular persons or institutions, but instead want to focus on the sort of social context confronted by those with depression, based on my lived experiences.

Depression is an alienating illness, especially when coupled with anxiety, as happens frequently. In my experience in academic philosophy circles, that alienation is amplified since mental health is not spoken of as a real entity. It is instead catalogued and discriminated by logic and reason as something other, an outside factor. The depressed are outsiders.

Depression is treated with a deafening silence, both inside of the academy and outside in society at large.

There is a social unseemliness to discussions of depression. Mental illness is a two-fold problem, private and yet public: private in that it is often suffered alone, public in that its effects reach out further than just the atomized individual.

Social behavior is socially determined, or at least prescribed. This naturally turns the personal experiences and troubles of every private individual into a public concern. When someone admits to experiencing depression, whether chronic or a phase, this fact becomes a public concern. We look to role models, finding only the public shaming of those role models who suffer mental illness. Public figures who admit to mental illness are asked rushed questions on the intimate details of their struggle. Everyone has an opinion on mental illness, and most of them are not only wrong but directly harmful to both individuals who suffer silently and society at large.

We are not beyond a society that sees mental illness as a stain within one’s soul, some present-age demons who continue to torment mortals. Mental illness still stands as something to be ashamed of because we want to believe in karma or something similar. We want to believe that the ills that we suffer are somehow dependent upon something we deserve.

Those of us who are more scientifically inclined want to believe that we can redeem and fix mental illness, as if it were machinery. If we could only figure out the brain, then we believe that we could “normalize” it, or better, “cure” it.

We wish for so much that it blots out the actual condition. All this wishing and hoping is a flight from the actual day-to-day concerns of depression. As Nietzsche states, “Hope is the worst of all evils, for it prolongs the suffering of people.”

Anything that disturbs a social norm makes everyone uncomfortable or at the very least brings up strong opinions. The recent suicide of Robin Williams has shown us yet again that the public doesn’t like talking about depression, certainly not in honest terms. Any suicide, but especially one of a public figure, becomes hyper-moralized. Now is the time for people to condemn Williams as “cowardly” or “selfish” for taking his own life, yet also to call him “brave” for struggling with his depression for so long. Other foolish moralists will say that depression is a divine gift, as it supposedly comes hand in hand with comedic ability.

These moral arguments come out again each time in vain. They are in vain since they try to rationalize the brutally irrational. The overbearing social stigma of depression makes a lot of sense at times. It is very uncomfortable to think that one can be one’s own worst enemy, that the mind can so pessimistically stand against reason or external pleasures. It is, indeed, unseemly.

However, it is this very unseemliness that is the reason that depression should be more openly discussed. It is constantly suppressed socially into restrictive norms that only exponentially increase depression’s own horrid effects of alienation and resentment.

Having high hopes for a radical social change regarding mental health is perhaps going to be nothing but a disappointment. This, however, does not mean that one should give up hope for change and radical action.

I think it should be the job of philosophy to demand that society’s discourse regarding mental health gets less awful. Good philosophy should offer alternatives for social problems, or at the very least scold the often careless ideologies that cause them.

But first, academic philosophy itself needs to turn its gaze to depression and how it is treated within its own ranks. We treat it with silence. No one finds it polite to speak on it, unless talking about the personal lives of the dead or as a dry systematic theory. We philosophers prefer to hold depression at arm’s length, even though it often lives so close within our chests as a tightening knot limiting our actions.

Depression is brutally irrational. It does not care for one’s successes, relationships, or anything else that is valued for a so-called good life. No matter how much one moves towards eudaemonia in one’s life, depression is there, lurking. As Winston Churchill described it, depression follows one around like a big black dog ever obedient to its master.

Depression drives me to gaze into abysses.

My philosophical interests rest at the intersection of ethics, phenomenology, and existentialism. I work heavily in Nietzsche and late Husserl, but have recently expanded into working on Kierkegaard and Sartre. None of these historical figures are light reading in any sense of the term. Nietzsche was clearly the king of the abyss and suffered a horrifying, debilitating illness which destroyed his mind and his body. Husserl lost a son to the First World War and, towards the end of his life, witnessed his rights dissolve as a Jewish intellectual in Germany. Kierkegaard struggled with his faith and anxiety throughout his life’s work. Sartre fought in the Second World War in the French Resistance and was notoriously bitter in his personal relationships. None of these figures are happy role models. A certain sadness produces good work, it would seem. That same certain sadness reflects on the page. I could, perhaps, “lighten up” and go towards lighter fare, work on thinkers who don’t reach such sad depths, but I don’t find much interest in such things. I instead stay the course in developing an ethics that looks right into the horrible things that people do.

My depression drives me towards a weighted sense of responsibility and is the reason I work in philosophy and ethics.

But we do not want to talk about it in the Academy. Despair and anxiety are seen as more suitable on a dissection table in a sterile setting. Even if depression is what drives us towards prolific writing, we stay quiet on its daily presence. We speak instead of depression as the motive for past generations, holding off from any honesty about ourselves and our motivations today.

In my MA program, I had several interactions with other graduate students in philosophy with different approaches towards depression, but universally, it is treated as a shameful subject. Many act horribly insecure about their mental health, either keeping it secret or, worse, bullying others who show any sign of depression, perceiving it as a weakness and those who evince it as prey.

I did speak with colleagues about my depression and anxiety. It hardly went well. One especially insecure classmate spoke with a nostalgia for the days when depression was called melancholia. In other words, he pined for the ‘good old days’ of misdiagnosis and mistreatment at the hands of deliberately ableist pseudoscience. Another former classmate who studies the intersections of psychoanalysis and philosophy quite hypocritically mocks anyone who is honest about their feelings. So moving forward, I buried mine.

Consequently, I let my depression take too much hold over me during this program. Things got particularly low when I faced a major setback in my studies at the very same time that I had a dramatic falling-out with some family members. My worsening depression alienated me from friends and colleagues. It fed itself. At the insistence of my spouse, I finally sought professional help which allowed me to put my depression and anxiety into a much more manageable condition. Even so, I stayed ashamed of my condition throughout my MA program. I avoided talking to anyone in my department about anything at all, let alone my depression.

At the point where I began antidepressants and laid off drinking for a couple of weeks to regulate, one of my classmates noticed. I mentioned that I was on a new medication; I did not mention what. He too gave that knowing and understanding look.

Both of us looked at each other knowing that we were struggling with the same condition, but saying nothing. Never did we say a thing about it.

There’s a certain intersubjective co-understanding here: the depressed recognize the depressed easily. But, ashamed, we say nothing for fear of outing ourselves, of admitting anything in honesty. Perhaps it was the program I was in, but insecurities ratcheted up and became more secret, more entrenched, and more ready to explode.

Instead, I spoke to others outside of my department through internet communities that understand and employ an important sense of honesty regarding disability. It just wasn’t ‘proper’ to talk to those who I knew in my program.

All of this shaming stigma needs to stop. Academia, and academic philosophy in particular, is already a stressful enough environment. Simply being within the Ivory Tower breeds insecurity, let alone trying to stay within it. Impostor syndrome is rife, yet shame in mental illness is pervasive. At the very least, all this mental illness-shaming seems like a waste of time and energy. At the very worst, it creates a subculture of alienated, disillusioned individuals who cannot trust one another, or their own attempts to see the strength inherent in the hard work they invest in living – surviving – with depression.

Soon after the First World War and losing his son, Husserl wrote to Arnold Metzger:

“You must have sensed that this ethos is genuine, because my writings, just as yours, are born out of need, out of an immense psychological need, out of a complete collapse in which the only hope is an entirely new life, a desperate, unyielding resolution to begin from the beginning and to go forth in radical honesty, come what may.”

Mental illness must be treated with a collective commitment to radical honesty that comes from recognizing our shared responsibility to ourselves and each other.

We academic philosophers must pick up this radical honesty when it comes to mental illness before collapse.

We need to look into our motivations more critically in order to live more ethically together. If we are to claim ourselves as a higher critical institution of people, we must open the discourse on mental health. This is not a call for sympathy, but for honesty among all parties involved in academia. Now, as I start a new PhD program, I am hoping to overcome oppressive silence with radical honesty, staying open before others and combating shaming stigma whenever I find it.

Online Collaboration: Scientists and the Social Network

Author: Richard Van Noorden
Original: Excerpts reprinted by permission from Macmillan Publishers Ltd: Nature 512, 126–129, copyright (2014)


[Infographic: Why scholars use social media (Twitter)]
Adapted by permission from Macmillan Publishers Ltd: Nature 512, 126–129, copyright (2014)

“A few years ago, the idea that millions of scholars would rush to join one giant academic social network seemed dead in the water. The list of failed efforts to launch a ‘Facebook for science’ included Scientist Solutions, SciLinks, Epernicus, 2collab and Nature Network (run by the company that publishes Nature). Some observers speculated that this was because scientists were wary of sharing data, papers and comments online — or if they did want to share, they would prefer to do it on their own terms, rather than through a privately owned site.

But it seems that those earlier efforts were ahead of their time —or maybe they were simply doing it wrong. Today, ResearchGate is just one of several academic social networks going viral. San Francisco-based competitor Academia.edu says that it has 11 million users. “The goal of the company is to rebuild science publishing from the ground up,” declares chief executive Richard Price, who studied philosophy at the University of Oxford, UK, before he founded Academia.edu in 2008, and has already raised $17.7 million from venture capitalists. A third site, London-based Mendeley, claims 3.1 million members. It was originally launched as software for managing and storing documents, but it encourages private and public social networking. The firm was snapped up in 2013 by Amsterdam-based publishing giant Elsevier for a reported £45 million (US$76 million).”

“Despite the excitement and investment, it is far from clear how much of the activity on these sites involves productive engagement, and how much is just passing curiosity — or a desire to access papers shared by other users that they might otherwise have to pay for. . . . In an effort to get past the hype and explore what is really happening, Nature e-mailed tens of thousands of researchers in May to ask how they use social networks and other popular profile-hosting or search services, and received more than 3,500 responses from 95 different countries.”

For study infographics, see below. For more on the survey findings and to read the complete Nature article: http://www.nature.com/news/online-collaboration-scientists-and-the-social-network-1.15711.


[Infographic: “Remarkable reach”]
Adapted by permission from Macmillan Publishers Ltd: Nature 512, 126–129, copyright (2014)

[Infographic: “Idle browse or chat?”]
Adapted by permission from Macmillan Publishers Ltd: Nature 512, 126–129, copyright (2014)

‘Failure’ of Graduate Education is No Joke

Author: Melonie Fullick
Original: University Affairs | Speculative Diction


Recently University Affairs published an interview with Kevin Haggerty and Aaron Doyle, two Canadian professors who have written a book of advice for graduate students. The book’s gimmick, if you want to call it that, is that it’s presented as a guide to failing—an anti-guide, perhaps?—as evidenced by the title, 57 Ways to Screw up in Grad School: Perverse Professional Lessons for Graduate Students. According to Haggerty and Doyle, “students often make a series of predictable missteps that they could easily avoid if they only knew the informal rules and expectations of graduate school.” If only! And this book, we’re told, is designed to help solve that problem.

Dropping all sarcasm, the first thing I have to say is: really?

… grad students’ “failure” is somehow all about the mistakes they make? How many times do we have to take this apart before faculty giving this kind of “advice” start realizing how it sounds? Maybe this is just a part of the “joke” and I’m not getting it, but how long is it going to be before the irrational and erroneous assumption that student success is entirely about individuals and their intrinsic merit and skills is displaced by a more realistic perspective?

Haggerty and Doyle have been promoting their book in the higher ed press for a couple of months now. While the University Affairs interview is relatively subdued, I want to bring your attention to their August 27th piece in Times Higher Education (THE), in which we were treated to a lively sample of just 10 of the ways grad students can ruin their own chances of academic success. Back when the article was first published I shared a series of critiques on Twitter; I’m going to risk boring you by repeating them here, because the book is receiving attention and there are some fundamental problems with the ideas that it reflects and reinforces.

At the outset, the authors explain how they’ve “concluded that a small group of students actually want to screw up. We do not know why. Maybe they are masochists or fear success.” This sort of set-up trivializes and dismisses serious problems; but things get worse from that point. Here are a few of the “screw ups” listed in the THE article, along with some of the criticisms they provoked from me and others:

  • “Stay at the same university” for all your degrees. This assumes students have unimpeded mobility, and a degree of financial security, from an early stage. Mobility is also affected by your family situation, for example—if you’re married with a partner who can’t relocate, or if you have dependents who need to stay where they are, this is a problem.
  • “Choose the coolest supervisor.” Bad supervision can be career-limiting. But who is going to be really honest and tell prospective students about a faculty member who is (for example) a research “star” but also a terrible supervisor? Not the program chair; not the other faculty; and not the students who are happiest to represent the program. It’s also possible that your prof is new to supervision and not equipped to handle it. The student is being imagined here as a consumer who has the responsibility to make an informed choice, even when the relevant information isn’t available or when they don’t actually have the option.
  • “Expect people to hold your hand.” The issue of responsibility is already a sketchy one in graduate education, and we don’t all arrive with the amount of cultural capital that’s required to be autonomous or “just know” what we’re responsible for—and what we can reasonably ask of a supervisor. So, what constitutes appropriate mentorship and guidance, and what is merely “hand-holding”? Who gets to decide? (Hint: not the students.)
  • “Concentrate only on your thesis.” Assuming, of course, that this is an option for everyone. When the authors suggest non-thesis activities, though, these are not things like family time or self-care (and definitely not a job), but other academic professionalization activities such as authoring journal articles and attending conferences—as if grad students don’t already receive the message that they must do All The Things if they want to be minimally employable.
  • “Have thin skin.” As with other things on the list, this is a difficult call because it’s so subjective, and the party giving feedback is often also in a position to define its appropriateness. As I said in a tweet, giving and receiving feedback, like most professional skills, requires practice and modeling—and that’s a two-way street.

I hope you’ll forgive me for not finding the topic of grad student “failure” an amusing one. Usually I like a good joke (especially at the expense of academe), but I just don’t see how it’s appropriate for this issue. The “light-hearted” approach is grating to me, and I wasn’t alone. Reactions to the article included: “horribly smug”; “their post is not amusing”; “I’m a Professor who has supervised dozens of PhDs and I disagree with almost all of what the authors said”; “this is ridiculous”; “clickbait, consumerism, classism”; “I had trouble getting past #1”; and “much could be turned around into ‘do your job, grad schools.’”

What’s even more frustrating is that almost every point made in the THE article could have been made in a helpful, critical and inclusive way, and simply wasn’t. In choosing this particular approach to “advice” the authors render their points not only unpalatable, but also condescendingly uncritical. Even if the advice is potentially of use, why put it in terms that are exclusionary to some students, and infantilizing to all? The authors make the argument that they’re sharing tacit knowledge, thus doing us all a favour. But they also seem to be ridiculing students for not having this knowledge at the outset. The use of the word “guilty” (in their interview) just reinforces the feelings many students already experience when they discover something’s going wrong.

Haggerty and Doyle aren’t alone in their assumptions, and that’s why these kinds of articles and books represent a problem. They aren’t mere one-offs; as I’ve argued before (and no doubt you’re all sick of hearing it), it’s still too convenient for graduate programs and supervising faculty to dismiss students’ “failure” as a problem with selection of students, students’ lack of commitment, and/or a bad “fit”—an approach that shifts the blame away from problems of supervisory competence, appropriate social and academic support, and departmental climate and culture. The fact that this perspective is espoused publicly by respected senior faculty members who not only supervise grad students but have also spent time as graduate chairs shows how pervasive and influential it is in academe.

As always, I’m not trying to argue that students have no responsibility for their own success. What I’m responding to is the framing of this as a problem almost entirely in their hands. We already know (from research, in fact) that this is an inaccurate depiction, and that students’ experiences in graduate education are affected at least as much by the supervisor, department, and peer group—as well as by structural factors such as class, race, gender, and disability—as they are by individual merits and choices.

I’m aware that the book will provide more detail than a short post on THE, but because it’s the framing rather than the content that’s a problem, maybe “less is more.” You don’t need a book like this when the same or better advice is available from people who’ll give you a constructive and critical perspective on professionalization and the norms and values of academe—the latter having been taken for granted by Haggerty and Doyle. I recommend you check out those diverse perspectives instead—there are too many to list here, but a few online sources that spring to mind are Pat Thomson, The Thesis Whisperer, Conditionally Accepted, PhDisabled, Explorations of Style, Gradhacker, and also (for some background) the bibliography of research on doctoral education that I linked to above. You can also try #phdchat on Twitter, where you’ll find a wealth of resources.

Given the variety and quality of the research and resources available, surely at this stage there’s no excuse to reiterate the same old tired themes about irresponsible students and the silly mistakes they make. I only hope we can move beyond this in future debates about graduate education.

Academic Assholes and the Circle of Niceness

Author: Inger Mewburn
Original: The Thesis Whisperer


Two of my favourite people in the academic world are my friends Rachael Pitt (aka @thefellowette) and Nigel Palmer. Whenever we have a catch up, which is sadly rare, we have a fine old time talking shop over beer and chips (well lemonade in my case, but you get the picture).

Some time ago Rachael started calling us ‘The B Team’ because we were all appointed on a level B in the Australian university pay-scale system (academic Level B is not quite shit kicker entry level academia – that’s level A just in case you were wondering – but it’s pretty close). I always go home feeling a warm glow of collegiality after a B team talk, convinced that being an academic is the best job in the entire world. Rachael reckons that this positive glow is a result of the ‘circle of niceness’ we create just by being together and talking about ideas with honesty and openness.

Anyway, just after I announced my appointment as director of research training at ANU, the B team met to get our nerd on. As we ate chips we talked about my new job, the ageing academic workforce, and research student retention rates. Then we got to gossiping — as you do.

All of us had a story or two to tell about academic colleagues who had been rude, dismissive, passive aggressive or even outright hostile to us in the workplace. We had encountered this behaviour from people at level C, D and E, further up in the academic pecking order, but agreed it was most depressing when our fellow level Bs acted like jerks.

As we talked we started to wonder: do you get further in academia if you are a jerk?

Jerks step on, belittle or otherwise sabotage their academic colleagues. The most common method is by criticising their opinions in public, at a conference or in a seminar and by trash talking them in private. Some ambitious sorts work to cut out others, whom they see as competitors, from opportunity. I’m sure it’s not just academics on the payroll who have to deal with this kind of jerky academic behaviour. On the feedback page to the Whisperer I occasionally get comments from PhD students who have found themselves on the receiving end  — especially during seminar presentations.

I assume people act like jerks because they think they have something to gain, and maybe they are right.

In his best-selling book ‘The No Asshole Rule’, Robert Sutton, a professor at Stanford University, has a lot to say on the topic of, well, assholes in the workplace. The book is erudite and amusing in equal measures and well worth reading, especially for the final chapter, where Sutton examines the advantages of being an asshole. He cites work by Teresa Amabile, who did a series of controlled experiments using fictitious book reviews. While the reviews themselves essentially made the same observations about the books, the tone in which the reviewers expressed their observations was tweaked to be either nice or nasty. What Amabile found was:

… negative or unkind people were seen as less likeable but more intelligent, competent and expert than those who expressed the same messages in gentler ways

Huh.

This sentence made me think about the nasty cleverness that some academics display when they comment on student work in front of their peers. Displaying cleverness during PhD seminars and during talks at conferences is a way academics show off their scholarly prowess to each other, sometimes at the expense of the student. Cleverness is a form of currency in academia; or ‘cultural capital’ if you like. If other academics think you are clever they will listen to you more; you will be invited to speak at other institutions, to sit on panels and join important committees and boards. Appearing clever is a route to power and promotion. If performing like an asshole in a public forum creates the perverse impression that you are more clever than others who do not, there is a clear incentive to behave this way.

Sutton claims only a small percentage of people who act like assholes are actually sociopaths (he amusingly calls them ‘flaming assholes’) and talks about how asshole behaviour is contagious. He argues that it’s easy for asshole behaviour to become normalised in the workplace because, most of the time, the assholes are not called to account. So it’s possible that many academics are acting like assholes without even being aware of it.

How does it happen? The budding asshole has learned, perhaps subconsciously, that other people interrupt them less if they use stronger language. They get attention: more air time in panel discussions and at conferences. Other budding assholes will watch strong language being used and then imitate the behaviour. No one publicly objects to the language being used, even if the student is clearly upset, and nasty behaviour gets reinforced. As time goes on the culture progressively becomes more poisonous and gets transmitted to the students. Students who are upset by the behaviour of academic assholes are frequently counselled, often by their peers, that “this is how things are done around here”. Those who refuse to accept the culture are made to feel abnormal because, in a literal sense, they are – if being normal is to be an asshole.

Not all academic cultures are badly afflicted by assholery, but many are. I don’t know about you, but seen this way, some of the sicker academic cultures suddenly make much more sense. This theory might explain why senior academics are sometimes nicer and more generous to their colleagues than those lower in the pecking order. If asshole behaviour is a route to power, those who already have positions of power in the hierarchy and are widely acknowledged to be clever have less reason to use it.

To be honest with you, seen through this lens, my career trajectory makes more sense too. I am not comfortable being an asshole, although I’m not going to claim I’ve never been one. I have certainly acted like a jerk in public a time or two in the past, especially when I was an architecture academic where a culture of vicious critique is quite normalised. But I’d rather collaborate than compete and I don’t like confrontation.

I have quality research publications and a good public profile for my scholarly work, yet I found it hard to get advancement in my previous institution. I wonder now if this is because I am too nice and, as a consequence, people tended to underestimate my intelligence. I think it’s no coincidence that my career has only taken off with this blog. The blog is a safe space for me to display my knowledge and expertise without having to get into a pissing match.

Like Sutton, I am deeply uncomfortable with the observation that being an asshole can be advantageous for your career. Sutton takes a whole book to talk through the benefits of not being an asshole and I want to believe him. He clearly shows that there are real costs to organisations for putting up with asshole behaviour. Put simply, the nice clever people leave. I suspect this happens in academia all the time. It’s a vicious cycle, which means people who are more comfortable being an asshole easily outnumber those who find this behaviour obnoxious.

Ultimately we are all diminished when clever people walk away from academia. So what can we do? It’s tempting to point the finger at senior academics for creating a poor workplace culture, but I’ve experienced this behaviour from people at all levels of the academic hierarchy. We need to work together to break the circle of nastiness.

It’s up to all of us to be aware that we have a potential bias in the way we judge others; to be aware that being clever comes in nice and nasty packages. I think we would all prefer, for the sake of a better workplace, that people tried to be nice rather than nasty when giving other people, especially students, criticism about their work. Criticism can be gently and firmly applied; it doesn’t have to be laced with vitriol.

It’s hard to do, but wherever possible we should work on creating circles of niceness. We can do this by being attentive to our own actions. Next time you have to talk in public about someone else’s work, really listen to yourself. Are you picking up a prevailing culture of assholery?

I must admit I am at a bit of a loss for other things we can do to make academia a kinder place. Do you have any ideas?

Academic Journals: The Most Profitable Obsolete Technology in History

Author: Jason Schmitt
Original: Huffington Post


The music business was killed by Napster; movie theaters were derailed by digital streaming; traditional magazines are in crisis mode – yet in this digital information wild west, academic journals and the publishers who own them are posting higher profits than nearly any sector of commerce.

Academic publisher Elsevier, which owns many of the most prestigious academic journals, has higher operating profits than Apple. In 2013, Elsevier posted a 39 percent profit margin, according to Heather Morrison, assistant professor at the University of Ottawa’s School of Information Studies, in contrast to the 37 percent margin that Apple reported.

The lucrative nature of academic publishing comes at a price – and that weight falls on the shoulders of the full higher education community, which is already bearing the burden of significantly decreasing academic budgets. “A large research university will pay between $3-3.5 million a year in academic subscription fees – the majority of which goes to for-profit academic publishers,” says Sam Gershman, a postdoctoral fellow at MIT who assumes his post as an assistant professor at Harvard next year. In contrast to the exorbitant prices for access, the majority of academic journals are produced, reviewed, and edited on a volunteer basis by academics who take part in the tasks for tenure and promotion.

“Even the Harvard University Library, which is the richest university library in the world, sent out a letter to the faculty saying that they can no longer afford to pay for all the journal subscriptions,” says Gershman. While this current publishing environment is hard on large research institutions, it is wreaking havoc on small colleges and universities because these institutions cannot afford access to current academic information. This is clearly creating a problematic situation.

Paul Millette, director of the Griswold Library at Green Mountain College, a small 650-student environmental liberal arts college in Vermont, talks of the enormous pressure that access to academic journals has placed on his library budget. “The cost of living has increased at 1.5 percent per year, yet the journals we subscribe to have consistent increases of 6 to 8 percent every year.” Millette says he cannot afford to keep up with the continual increases, and the only way his library can afford access to journal content now is through bulk databases. Millette points out that database subscriptions seldom include the most recent material; publishers purposely impose an embargo of one or two years on the most current information, so libraries still need to subscribe directly to the journals. “At a small college, that is what we just don’t have the money to do. All of our journal content is coming from the aggregated database packages – like a clearing house, so to speak, of journal titles,” says Millette.
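To see how quickly that gap compounds, here is a minimal sketch; the starting dollar figure is hypothetical, and only the 6-8 percent and 1.5 percent growth rates come from Millette’s account:

```python
# Hypothetical starting point; only the growth rates come from the article.
journal_cost = 100_000.0    # annual journal spend (illustrative figure)
library_budget = 100_000.0  # serials budget, growing with the cost of living

for year in range(10):
    journal_cost *= 1.07      # mid-point of the 6-8% annual increases
    library_budget *= 1.015   # 1.5% cost-of-living growth

print(f"Journal costs after 10 years:  ${journal_cost:,.0f}")   # ~$197,000
print(f"Library budget after 10 years: ${library_budget:,.0f}") # ~$116,000
```

On these rates the subscription bill roughly doubles in a decade while the budget meant to pay for it grows by about 16 percent.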

“For Elsevier it is very hard to purchase specific journals – either you buy everything or you buy nothing,” says Vincent Lariviere, a professor at Université de Montréal. Lariviere finds that his university uses only 20 percent of the journals it subscribes to; the other 80 percent are never downloaded. “The pricing scheme is such that if you subscribe to only 20 percent of the journals individually, it will cost you more money than taking everything. So people are stuck.”

Where To Go:

“Money should be taken out of academic publishing as much as possible. The money that is effectively being spent by universities and funding agencies on journal access could otherwise be spent on reducing tuition, supporting research, and all things that are more important than paying corporate publishers,” says Gershman. John Bohannon, a biologist and Science contributing correspondent, is in agreement and says, “Certainly a huge portion of today’s journals could and should be just free. There is no value added in going with the traditional model that was built on paper journals, with having people whose full time job was to deal with the journal, promote the journal and print the journal, and deal with librarians. All that can now be done essentially for free on the internet.”

Although this clearly sounds like the path toward the future, Bohannon says that from his vantage point it is not one-size-fits-all: “The most important journals will always look pretty much like they do today because it is actually a really hard job.” Bohannon finds that broader journals such as Science, Nature, and Proceedings of the National Academy of Sciences (PNAS) will always need privatized funding to complete the broad publication tasks.

Another Option?

“A better approach to academic publishing is to cut out the whole notion of publishing. We don’t really need journals as traditionally conceived. The primary role of traditional journals is to provide peer review and for that you don’t need a physical journal–you just need an editorial board and an editorial process,” says Gershman.

Gershman lays out his vision for the future of academic publishing: in a very different sort of publishing system, everybody could post papers to a pre-print server similar to the currently existing arXiv.org. After posting the research, the author chooses to submit it to a journal, which essentially means sending the journal a link to the paper on the pre-print server. The editorial board runs the same editorial process that exists now – if the paper is accepted, the journal puts its imprimatur on it, saying it was accepted to that journal – but there is no actual journal; it is just a stamp of approval.
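As a toy illustration of that “stamp of approval” model (essentially what is often called an overlay journal), here is a minimal sketch; every class and name in it is hypothetical rather than a description of any existing system:

```python
from dataclasses import dataclass, field

@dataclass
class PreprintServer:
    """Stands in for something like arXiv.org: stores papers, hands back links."""
    papers: dict = field(default_factory=dict)

    def post(self, author: str, title: str) -> str:
        paper_id = f"preprint/{len(self.papers) + 1}"
        self.papers[paper_id] = {"author": author, "title": title}
        return paper_id  # the link the author sends to a journal

@dataclass
class OverlayJournal:
    """No physical journal: an editorial board that endorses preprint links."""
    name: str
    endorsed: list = field(default_factory=list)

    def submit(self, paper_id: str, accepted: bool) -> bool:
        # 'accepted' stands in for the outcome of the usual peer-review process.
        if accepted:
            self.endorsed.append(paper_id)  # the journal's stamp of approval
        return accepted

server = PreprintServer()
journal = OverlayJournal("Hypothetical Journal of Examples")
link = server.post("A. Author", "On Stamps of Approval")
journal.submit(link, accepted=True)
print(journal.endorsed)  # ['preprint/1'] - endorsed, but never "published"
```

The paper itself never moves; only the endorsement does, which is where most of the cost savings Gershman describes would come from.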

What Gershman’s concept does is remove most of the costs from the equation. The cost of running this pre-print server would be shared across all universities and funding agencies, and could infuse millions, perhaps billions, back into the broader higher education system should an overarching system be implemented and respected. Bohannon is not convinced this is an easy sell. “We would need a real revolution. By revolution I mean a cultural revolution among academics. They would have to totally change the way they do business and, despite having the reputation of being revolutionary, academics are pretty conservative. As a culture, academia moves pretty slow.” Nathan Hall, professor at McGill University, follows Bohannon’s reasoning and says, “I think there is a sense of security in maintaining a set of agreements with known publishers with reputations like Wiley or Elsevier. I think universities aren’t quite aware of the benefits and logistics of a new system and they are comfortable maintaining existing relationships despite some questions about what the publishers are providing.”

Open Access for the Future?

“The phrase ‘open access’ can mean several things,” says Lariviere. Open access on a broad scale refers to unrestricted online access for peer-reviewed research. Lariviere details how publishers have co-opted this terminology and in doing so perhaps increased profit further. “Elsevier says you can publish in open access, but in reality it means paying twice for the papers. They will ask me ‘do you want to publish your paper open access’ which means paying between $500 and $5,000 additional for that specific paper to be freely available to everyone. At the same time, they will not reduce the subscription cost to the overall journal, which means they are making twice the money on that specific paper. If you ask me if this type of open access is the way to go, the answer is no.”

Luckily, large granting bodies have begun using their clout to push toward true open access. The National Institutes of Health (NIH) has been a longstanding champion of open access: since 2008, the NIH has mandated that all research it funds be published open access. More recently, the Bill and Melinda Gates Foundation brought its clout into the open access conversation. Starting in January 2015, all work funded through the Gates Foundation will be open access, and the foundation says: “We have adopted an Open Access policy that enables the unrestricted access and reuse of all peer-reviewed published research funded, in whole or in part, by the foundation, including any underlying data sets.”

As higher education is redefined to meet the needs and affordability required of the 21st century, certainly the most basic functions of sharing academic research need to be retooled. There is no reason an academic publisher should have such a significantly different economic picture from standard publishers. The stark contrast is troubling, as it shows just how far from reality our higher education system has drifted. Correspondingly, there is no reason universities should pay $3.5 million to have access to peer-reviewed data. This academic conversation is society’s conversation–and it is time that the digital revolution level one last playing field: because we, the people, deserve access.

On #PeerRevWk16: An Entirely Cynical Perspective

N. C. Hall  /  12/10/2016


#PeerRevWk16 is an annual effort by academic publishers to bolster flagging peer review participation, quality, and speed through explicit statements of thanks and recognition.

Although this initiative could be viewed as a face-valid effort by a public service industry charged by governments and post-secondary institutions with the sacred, inestimable responsibility of research dissemination, it sidesteps the major issues underlying academics’ reluctance to review. From huge publisher profits afforded by gouging public institutions and not meaningfully compensating academics, to unjustifiably high open access fees and peer review patents to stifle competition, there are serious systemic problems with the peer review process that this hashtag effort does little to address.

Basically, I started to feel uncomfortable seeing publishers attempt to dominate a hashtag ostensibly “for” academics with tweets containing marketing-department infographics on what academics want, promoting a new reviewer ratings system, or sharing “how-to” guides to cost-effectively improve the quality/speed of free academic labour. In response, it seemed important to balance this profitable status quo narrative by highlighting the uncomfortable realities of the peer review process for academics. I am by no means an expert on higher education policy/ethics/economics; I just wanted to share information and balance the discussion about how to promote research quality by better supporting those who do it.

It all started a few weeks ago when I first noticed tweets from academic publishers pop up in my timeline underscoring the importance and novelty of thanking peer reviewers as well as quantifying/ranking peer review efforts:

In typical fashion, I responded with flippant sarcastic commentary, thinking it to be an obviously transparent (and hopefully temporary) publisher effort to pacify volunteer reviewers with a pat on the back and self-relevant data:

But this weird gratitude parade only seemed to be ramping up, and it got me thinking more seriously about the motivation behind these reviewer appreciation efforts:

As a good academic, I supplemented these devastating hot takes with references to external sources outlining the growing dissent concerning the publication process:

With such an eviscerating response to this uncomfortable wave of public publisher affection, I thought my job was done. However, I soon realized there was an actual hashtag for this initiative – #PeerRevWk16 – and an entire week to come of publisher efforts to spam Twitter with pre-scheduled, strategic gratitude PR aimed at thanking academics by educating them as to their peer review value and responsibilities.

Some #PeerRevWk16 publisher tweets hoped to inform researchers of the importance of peer review as the cornerstone of scientific inquiry, as if they were somehow not addressing individuals who by definition should be not only intimately familiar with the scientific process but who have based their research careers largely on this premise:

Other tweets expressed heartfelt thanks to reviewers for their time and effort through mass cut-and-paste “publishers are people too” gestures garnering remarkably few RTs or replies:

Publisher spam also included regularly scheduled marketing-office infographic blasts educating academics about why they do (read “should do”) peer reviews, with most results ironically showing academics to have already decided on better ways to spend their time:

And then there were the tweets consistently promoting the new reviewer recognition system “Publons”: a publisher-owned effort to bolster peer reviewer commitment by tracking, quantifying, and ranking peer reviewers:

But perhaps the most condescending #PeerRevWk16 tweets were those gently informing academics as to how they could better perform their free publisher labour:

So I admittedly got a bit snarky:

And being on sabbatical, I soon diverted words from manuscript revisions to countering this increasingly awkward, oblivious, and patronizing publisher narrative implying that problematic peer review disengagement could be remedied not by meaningful compensation or real talk about peer review costs, but by a Twitter campaign aimed at educating, flattering, and shaming academics. Again, I’m not an expert on the academic publishing industry, but it seemed important to share some thoughts on issues that were clearly being avoided, such as:

1.  The peer review burden on vulnerable academics:

2.  The ethics of peer review compensation:

3.  In-store credit as review compensation:

4.  Financial compensation for peer reviews:

5.  The exclusion of industry expertise:

6.  Peer review sampling bias:

7.  The “gamification” of peer review:

8.  My personal review perspective:

9.  Public perception of publisher appreciation efforts:

So while the #PeerRevWk16 initiative does on the surface present as an effort to simply thank and support peer reviewers, a quick consideration of the academic publishing landscape suggests that it may also represent an effort to whitewash growing public discontent over a massively profitable industry that does shamefully little to show respect for the free academic labour on which it relies:

So for good measure, I doubled down with @AcademicsSay to better punctuate the #PeerRevWk16 publisher noise:

Even Nature got in on the fun:

And despite publisher-provided highlight reels of #PeerRevWk16 in which most of the above is effectively excluded, the narrative that resonated most with academics was obvious:

As to where to go from here, there were a few thoughts:

Maybe it’s just me, but this hashtag effort at best seems intended to distract from publisher problems or promote new publisher products. At worst, it seems a fundamentally misguided attempt to sustain profits by increasing peer review engagement among (a) inexperienced, less expert academics not yet familiar with the scientific process, (b) early career researchers trying any way they can to demonstrate a willingness to sacrifice their time and energy to potential employers, or (c) already overburdened academics disillusioned with the publication process who need and will take the self-esteem boost despite its patronizing tone.

Is a thank you from publishers for peer reviewing appreciated? Perhaps, but that’s not why we do it. And as a transparent attempt to placate a base increasingly dissatisfied with publishers profiting from their good will, institutional/disciplinary pressure, and passion for science, the #PeerRevWk16 effort kinda looks like using the “tip” section of a bill to provide actual tips on how to serve publishers better:

Of course, I might be entirely off-base in interpreting #PeerRevWk16 as anything other than a face-valid attempt to show some much-needed appreciation to hardworking volunteers. But as a leading authority on pandering to academics on Twitter, I can safely say that academic publisher trolling could use some work.


What I Learned About Writing by Not

Author: Rebecca A. Adelman
Original: www.rebeccaaadelman.com


All is not lost.  What I have lacked in tangible productivity over my long season of writer’s block (which seems finally to be limping its way to a close), I have gained in new understandings of the intricacies of my writing process and the fussy mechanics of getting words on the page.

When you aren’t getting words on the page, it’s crazy annoying (at best) to hear about people that are.  And it’s similarly unpleasant to receive unsolicited suggestions about how to get yourself unstuck.  As if it was simply a matter of will or ergonomics or mental hygiene.  But if it was that easy, anyone could do it.  Producing good work, and doing it well, takes more than that.  So here are a few things I figured out about being productive when I was struggling to produce anything at all.  It’s an open letter, of sorts, to my writerly self – the “I” is me, and so is the “you.”  But the “you” can also be, you know, you, if you are reading this and wanting to reconsider your writing praxis.

Become attuned to your limits.
It’s hard to tune out the constant drone of academic meta-commentary about how much (or, from the occasional maverick, how little) we work.  And it helps to know that most of those aggrandizing self-reports are bullshit.  But even still, focusing too much on what other people are doing, or not, just leaves me insecure, or anxious, or envious.  So spend less time worrying about what other people are doing and focus on your own patterns. Then figure out how you work, and be honest about whether all the hours you spend “working” are actually that.  For example, I’ve figured out that I’m neither efficient nor terribly lucid after dinner, and that even when I go back to work late in the evening, I’m not getting much done besides maybe assuaging my guilt about not working enough.

Diminishing returns are a thing.  So consider whether you might be better served by reinvesting those mediocre or largely symbolic work hours elsewhere.

Figure out how you want the experience of writing to feel.  
Turns out, there are no extra points for suffering.  Or if there are, they circulate in an economy that is wildly unrewarding.  Like the counters where you redeem your tickets at arcades: a small fortune in tokens and hours spent playing Skeeball leave you with an armload of little cardboard rectangles, and the teenager in charge of the whole operation barely acknowledges you when you come to select your prize, and it ends up that all you can afford is a pencil case.  Anyway.

Few of us have the luxury, presumably, to only write when it feels good.  Deadlines, tenure, promotion, &c.  But unless you produce your best work in the throes of abject misery, experiment with the novel practice of setting your writing aside when writing feels terrible.  We all have different thresholds for ‘terrible,’ and that terrible feeling might be mental or physical, but when you encounter that threshold, I think it’s smart to heed it. Admittedly, I am still relatively new to the routine of being a peer-reviewer, but I have not yet encountered a reviewer questionnaire instructing me to give special consideration to a project if I think the author cried a lot (A LOT) while they composed it.  And if there are people who will give you extra credit for your anguish, think carefully about whether you want to play by that set of rules.

Spend some time thinking about how it feels when you are doing your best work.  Maybe you feel focused, or excited, or peaceful, or maybe you’re so in it that you don’t feel anything at all.  Take advantage of those times, figure out how to increase their frequency if possible, develop strategies for doing good-enough work in circumstances that only approximate them.  And otherwise: leave it alone.

Work at a pace that’s sustainable.
Pretty much every academic I know, including me, is overcommitted.  There are lots of reasons for this, both individual and structural.  Obviously, everybody will define “overcommitted” in their own ways, and experience being overcommitted idiosyncratically.  I’ll need to figure out, eventually, why I have a tendency to hoard projects, but here’s what I know for now: I tend to overestimate the amount of time that I have before a deadline, while underestimating how much work I will want to put into a given project.  Part of me also imagines that the asteroid will surely hit between now and whatever deadline so it won’t actually matter.

I can manage the consequences of my over- and underestimating (as well as the general paucity of asteroids) fairly well under normal circumstances.  But when shit inevitably happens, that mismatch becomes acutely untenable.

So: try to plan out your projects and commitments, as best as you are able, so that they align with how busy you want to be, and when, while also maintaining an overall mode of existence that is tolerable.  (Parenthetically, I think academics ought to aspire to existences that are more than tolerable, and break the habit of postponing tolerability until the summer.)  Not all of this is in your control, of course, so another part of writing and working well is, I think, accepting that those plans won’t always pan out.  And leave a margin for catastrophes, great and small.  If your whole writing scheme is contingent on you never getting a flat tire / your kid never getting sick / you never getting called for jury duty / no one you love ever needing you or dying, it probably isn’t going to work for you long-term.

Consider what it’s worth to you.
Because we are all, alas, constrained by the laws of time and space, doing one thing generally means not doing another (or half-doing two things at once).  Try to be cognizant of the trade-offs your writing affords and requires of you.  Be honest about whether the potential rewards actually appeal to you, and your values.  And then consider the costs, and whether they’re acceptable.  With a few exceptions, I am generally fine to sacrifice binge-watching for writing.  And sometimes I feel very okay opting out of being social so I can stay in and work.  But on the other hand, it’s almost never worth it to me – though it used to be – to trade work for sleep, or healthy food, or exercise.  Maybe your non-negotiable stuff is different.  The point is to figure out what that non-negotiable stuff is, and protect it … otherwise work will eat it all.

Detach from the outcome.
Beyond doing your best to make your ideas intelligible and your style engaging, you can’t control how people will respond to your writing.  Consider your audience, but don’t obsess about them, and learn the difference between wanting to connect with your readers and needing to charm and trap them into your ways of seeing and thinking.  Efforts to engineer reader reactions almost never generate better writing, and are much more likely to result in arguments that overreach or resort to pedantry, while the fixation with impressing your audiences will ultimately leave you stultified and unable to say much of anything at all.  Good ideas are much easier to come by than magic words.

Look, and move, forward. 
You will have seasons when you are more productive, seasons when you are less productive, and seasons when you are scarcely functional.  Hopefully, over the course of your writing life, these will balance out into an overall sense of accomplishment, with a body of work that bears it out.  When you are more productive, spend some time figuring out what enables you to work at that level, but don’t make yourself crazy trying to recreate it every time you encounter a slump.  Chances are, it’s mostly a matter of circumstance: a legitimate manifestation of your brilliance, sure, but maybe also just good luck. Conversely, the seasons when you are less productive are also likely to be those in which your luck is worse than usual, and not a  final revelation of your incompetence.

Capitalism tells us that time is modular, that any hour has potentially the same value as any other hour, and hence that missed hours can be replaced.  Nope.  If there is something big that keeps you from your work for a season, you won’t (sorry) be able to get those hours back.  And especially if that something big is also something massively unpleasant, you probably won’t be able to stop feeling lousy about those lost hours, anxious or mournful about the work you could be doing, and resentful of the people around you who happen to be enjoying one of those good-luck seasons of magical writing.  In those moments, all you can do is muddle through: do what you can with your radically reduced resources, plead for deadline clemency if you need it, and accept – your overwhelming fatigue may help lubricate this process – that you probably won’t be producing your very best work at this particular godawful juncture.  And don’t compound the insult by blaming yourself for those lost hours, those words left unwritten.  For my part, now that I’m halfway (give or take) back in the saddle after a pretty unrelentingly miserable eighteen months, it’s a daily struggle not to take the losses of that period out on myself.  It takes a lot of mental discipline to focus on what you can do, not on what you didn’t because you couldn’t.

*    *    *    *    *

So that’s a little bit of what I know now that I didn’t know before.  It strikes me as odd that academics, generally so good at questioning why things are the way they are, rarely bring their skeptical sensibilities to the task of questioning their own work habits or the expectations they have internalized.  And for those who are satisfied with their circumstances, there may be no need for this kind of querying.  But I get the impression (or maybe I just run with an exceptionally grumpy crowd) that lots of us are less than satisfied.  Of course, many of the reasons for that are structural, and so insuperable by these tiny little hacks.  But despite this, or maybe because of it, minor adjustments made in the service of your own comfort are meaningful, worth it, and necessary.


Unpacking @AcademicsSay: Part 1

N. C. Hall  /  06/05/2016


This is my first blog post.

And the only reason you’re seeing it is @AcademicsSay, ostensibly one of the most influential academic social media accounts reaching upwards of 24 million views a month across platforms.

Although polite company warrants eyes-down, humblebrag explanations of the success of this social experiment as serendipitous, that’s not entirely accurate. Instead, the account growth has been markedly consistent, largely anticipated, and intentionally facilitated by strategies common to influential accounts.

To the extent the following may read as a self-indulgent, overthinking, faux-Machiavellian hyper-justification of writing procrastination, I apologize in advance. Below is Part 1 of a tl;dr overview of the varied growth hacking strategies derived mainly from observation, basic psychology, and trial-and-error that may or may not have contributed to the success of @AcademicsSay.


1.   Opportunity. When I set up my professional Twitter account in May 2013, there was no common gathering point for faculty or lightning rod for feedback/sharing. There were no clear accounts to follow first, nothing central that really got academics excited. I wanted to create that, first because it’s confusing and boring to go online and not have a place to connect with others. Second, I was feeling burnt out and needed a laugh. There were also no humour accounts for faculty, aside from scattered student-shaming efforts and @PhDComics for grad students, so I made one. I am not a humour writer. But you don’t need to be great when there’s no competition; you just need to show up.

2.   Tone. I am not a generally positive person. So when deciding how to sound online, I went with my regularly scheduled deadpan, sarcastic, depressing, uncomfortably self-aware over-explanations that make for awkward conversation. I also pride myself on avoiding the wrath of colleagues by getting a laugh despite my interrupting their work as a way of procrastinating on mine. So the overall tone of @AcademicsSay was basically an extension of what I was already doing, just in a more distilled online format. I then found a recognizable meme that fit the tone and went from there. Fortunately, as non-intellectual or unintentionally humorous aspects of academic content tend to get the most attention on social media (e.g., the “Gabor” effect), I was immediately in business.

3.   Authority. I regularly get comments, questions, and surprisingly impassioned critiques about the account’s behaviour; hopefully this section addresses some of that. In addition to content tone, I incorporated from the outset a set of implicit cues to convey authority to potential followers and expedite follow/retweet decisions. This was for two reasons: first, to provide an ironic take on the stereotypical aloof, egocentric academic persona; second, to mimic the profiles of existing viral parody accounts in the history or science domains. Some examples involving language, formatting, colour, and ratios are below.

4.   Language. The word “shit” in the account name implies irreverence or catharsis and is unexpected in academic timelines, grabbing attention while providing ironic context for otherwise curse-free content. The account handle remained curse-free to accommodate more respectable manual retweets. Similarly, “academics” not “professors” were referred to in the account name to convey faculty responsibilities beyond instruction (e.g., writing, tenure requirements, work-life balance). As the content was to be more “water-cooler gossip” or internal self-talk than in-class “dad jokes,” the less-than-student-centered approach was intentional.

5.   Formatting. Tweet text was formatted to exclude “all caps,” emoticons, exclamation points, and question marks to mitigate impressions of attention-seeking and uncertainty. In addition to facilitating a deadpan or aloof tone, ending sentences with periods was also a bit of an inside academic joke, not unlike how Kanye West describes the private hilarity of not smiling. To not dissuade engagement among academics who are typically less than familiar with Twitter protocols, I also initially tried to avoid including nonintuitive hashtags (e.g., #ecrchat) and acronyms (e.g., H/T) in favour of more accessible terminology (e.g., via, courtesy of).

6.   Colour. The colour profile was also intentional. Although the specific profile image (“avi”) was selected almost at random from my cell phone, it needed to satisfy two  conditions: it had to show well at lower resolutions and needed to be red. The colour red was emphasized based on research showing red to implicitly convey competitive success and dominance in affiliative and advertising contexts (e.g., CNN, Time, Science, Netflix, BuzzFeed, TMZ, TED Talks) and to solicit more online engagement (e.g., link clicks) than other colours. The image itself is simply a cropped photo of a graffiti art gorilla I took on the sidewalk after a disappointing trip to the farmer’s market. I’d like to think the gorilla signified other elements (e.g., stoicism, “300-lb gorilla” metaphor), but it’s mainly just red.

7.   Ratios. The account also manipulated three Twitter ratios to implicitly convey authority. First, an exaggerated “following-to-follower” ratio was achieved by not following other accounts (as per other parody accounts) requiring unidirectional follows vs. reciprocal “followbacks.” Second, the “retweet-to-follower” ratio was bolstered by deleting tweets that did not sufficiently resonate; a ratio consistently held to around 0.001. For example, tweets in Spring of 2014 (~30K followers) that did not reach 30 retweets were omitted (typically within an hour), with the exception of tweets including links or promoting content intended for “clickthroughs” (current cut-off is ~150 retweets <1 hour, >1K likes on FB; see @TheTweetOfGod, @SoVeryBritish for comparable ratios). Third, deleting tweets with insufficient retweets helped to improve the “tweet-to-follower” ratio. Off-brand tweets promoting specific accounts, lists, hashtags, sites, etc. were similarly omitted to provide an on-brand, content-focussed read for timeline scrollers (“grooming”). Overall, these ratios were maximized to create the impression of an authoritative, non-reciprocal, content-provider account where each tweet not only resonated but gained substantial followers.
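
For the numerically inclined, the pruning rule above boils down to something like the following sketch in Python; the Tweet record, cut-offs, and grace period here are illustrative assumptions rather than the actual tooling behind the account.

```python
from dataclasses import dataclass

# Hypothetical sketch of the retweet-to-follower pruning rule described above.
# The Tweet record, RATIO, and GRACE_MINUTES are assumptions for illustration.

RATIO = 0.001          # target retweet-to-follower ratio
GRACE_MINUTES = 60     # how long a tweet gets to reach the threshold

@dataclass
class Tweet:
    text: str
    retweets: int
    age_minutes: int
    has_link: bool = False  # link/promo tweets were exempt from early pruning

def should_delete(tweet: Tweet, followers: int) -> bool:
    """Flag a tweet for deletion if it underperforms the ratio after the grace period."""
    if tweet.has_link:
        return False
    threshold = RATIO * followers  # e.g., ~30 retweets at ~30K followers
    return tweet.age_minutes >= GRACE_MINUTES and tweet.retweets < threshold

# Example: at ~30K followers, a 25-retweet tweet older than an hour gets pruned.
print(should_delete(Tweet("deadpan observation", retweets=25, age_minutes=90), followers=30_000))  # True
```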

8.   Branding. Similar to other viral parody accounts, @AcademicsSay does not reply or retweet. Instead, standalone text reposted from other accounts is formatted as per a typical academic quotation (“…” – @source), or (more rarely) as a screenshot image, to visually associate or “rebrand” it with the account name and image. The quotation format is immediately recognizable to academics but differs from typical (less visually appealing) manual retweets in which acronyms and the original account are inserted before tweet content (e.g., RT “@source …”). This form of attribution is generally appreciated by those referenced, avoids “Twitter plagiarism,” and facilitates portability across platforms (e.g., Facebook, Tumblr). However, it can also be seen as particularly distasteful (especially screenshots) as it effectively affords self-promotion and metric gains at the expense of direct engagement with source accounts. Given the markedly ego-involving nature of not following someone on Twitter or Facebook, it’s perhaps not surprising that this strategy has to date been the most negatively received.

9.   Images. One of the most well-known and easily implemented ways of increasing Facebook or Twitter engagement is to just add an image (e.g., by 35%). So after waiting three months to ensure that text-based content was resonating with followers (~7K), relevant images were introduced. At this point, I had decided to use the account to recruit for off-line research and consciously opted to forego whatever old-guard, intellectual cachet was attached to exclusively sardonic text in favor of incorporating more accessible, existing visual content that elicited a more visceral response (e.g., May 2014: doubling new followers/day to 450+ by doubling down on comics, graphics, and screenshots). Given a long-standing body of work by academic comic legends (e.g., PhD Comics, XKCD) and creative efforts of emerging webcomic artists (e.g., Errant Science, RedPen/BlackPen, The Upturned Microscope), finding content wasn’t hard and I finally had a chance to indulge my long-time love of cartoons. I eventually introduced original images and memes to capitalize on social media norms, mocked up preview graphics to increase clicks for news articles or blogs (16:9 to prevent awkward Twitter cropping, better Facebook previews), and started embedding square blog logos that are automatically grabbed when a link is shared.

10.   Attribution. Given the emotional and financial investment involved in creating visual content for social media, I eventually started to receive responses from artists requesting that additional source information be included in posts beyond that contained in the image. And after a few requests by original artists (e.g., @MacLtoons, Kemson Cooper), online criticism when attribution was not included (e.g., Paris attack graphic), and an education on attribution and copyright by my friend Jorge Cham (@PhDComics) following an uncomfortable Twitter/email exchange with artist @twisteddoodles, I not only research the origins of posted artwork (e.g., TinEye, Karma Decay, Veracity) but try to provide linkbacks to within-platform accounts or external sites to not deprive artists of potential exposure or income. Although posting images without attribution or linkbacks is more efficient (particularly when source/contact info is embedded), a well-worn strategy for expediting growth (see @HistoryInPics, IFLScience), and not unpermitted in the Twitter TOS (see p. 22, Agence France Presse v. Morel), it is more susceptible to removal on Facebook or Twitter (DMCA takedowns) on copyright grounds and is not a good look for an academic audience uncommonly preoccupied with attribution.

11.   Anonymity. I ran the account anonymously until July 2015 for various reasons. First, I didn’t want my atypical online activities to somehow influence my tenure deliberations. It also helped to maintain a focus on the followers, underscoring the aim of the account to resonate based on shared experiences rather than a self-indulgent showcase of intellectual, writing, or humour abilities. In this way, followers were allowed to perceive their engagement more simply as sharing a laugh or connecting with others by way of satire, as opposed to endorsing the attention-seeking efforts of a specific individual. This decision also helped to circumvent the awkward self-esteem-loaded “followback” expectation otherwise encountered with personal Twitter accounts. In a similar vein, demographic cues involving nationality (e.g., American spelling), gender (typically assumed female), race, rank, or discipline that could unnecessarily complicate or bias content perception and mitigate engagement were avoided. As an anonymous account, I was also allowed more freedom to make mistakes and experiment in terms of content (e.g., topics, attribution) or growth strategies (e.g., branding, promotion) without risk of direct criticism or reprisal.

Maybe it’s because academics tend to be familiar with blinded research and manuscript reviews that remarkably few people ever asked who I was. Or maybe it’s that social media platforms generally promote engagement over attribution, a point illustrated by Twitter adding the “quote tweet” function in 2015 while at the same time quietly removing the automatic insertion of quotation marks and account mention (used for manual retweets) when copying tweets in the app (making it much easier to plagiarize). Regardless, it was only after my tenure was confirmed, account influence exceeded relevant benchmarks, the cachet of “coming out” could be reliably predicted to bolster off-platform efforts (study recruitment), and these unconventional online activities could be justified in part as a public service to non-social-media users that I wrote the Chronicle piece about the account (as agreed upon one year earlier). However, judging by continued confessions of love for “whoever you are” or “you guys,” and minimal spillover to my personal Twitter account, people generally don’t seem to notice or care who’s running the account.

12.   Efficiency. To promote initial growth, I also pre-prepared tweets that released automatically on apps like Buffer (Facebook pages provide in-platform scheduling) and used free sites like Tweriod to determine optimal tweet times (now largely irrelevant due to international reach). Not unlike other parody or satire accounts, I also regularly repeat content. Although I had previously deleted original tweets to disguise this strategy (some accounts delete tweets wholesale, presumably for the same reason), I now keep them up to gauge growth. I initially felt comfortable repeating only after a 6 month lag (consistent with previous Twitter API restrictions preventing older tweets from being viewed), but now tend to repost within 2-3 months due to a follower base big enough to ensure sufficient sharing from those who would not have seen it, would not remember seeing it, or would not mind seeing it again. Although some repeats are verbatim, others are reformatted or modified (e.g., replacing “book” with “blog” 9 days later) to improve engagement. As for the account meme, the “shit xxx say” format itself affords specific efficiencies, such as a focus on what others say (observation is much easier than inspiration) and basic text (Siri dictation while waiting at Starbucks vs. curated content or creating visuals), as demonstrated by even single-letter posts gaining traction. Finally, one unanticipated consequence of this meme is the extent to which it actually encouraged crowdsourced feedback (replies, mentions, emails) that has to date been highly effective in terms of providing off-platform content, pop culture phrases (e.g., “all of the things”, “Netflix and chill”), timely memes (e.g., Game of Thrones), or even grammatical improvements for repeat posts.
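
Similarly, the repeat-posting lag is essentially a cooldown check; the sketch below is purely illustrative, with the function name and the roughly 2-3 month window assumed for the example rather than taken from any actual scheduling code.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the repeat-posting lag described above: content becomes
# eligible for reposting only after a cooldown (originally ~6 months, now 2-3).

REPOST_COOLDOWN = timedelta(days=75)  # roughly two and a half months

def eligible_for_repost(last_posted: datetime, now: datetime) -> bool:
    """Return True once enough time has passed that most followers are unlikely to recall the tweet."""
    return now - last_posted >= REPOST_COOLDOWN

# Example: a tweet last posted in early January is eligible again by late March.
print(eligible_for_repost(datetime(2016, 1, 5), datetime(2016, 3, 25)))  # True
```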


So there you go: a quick introduction to some of the more straightforward strategies adopted a priori or over time to expedite follow decisions and account growth for @AcademicsSay. For more on the roles of analytics, experimentation, and emotions, or more awkward topics such as plagiarism, haters, and monetization, check back for Parts 2 and 3 in the coming days.



How to Not be Boring on Academic Social Media

Author: @TheLitCritGuy
Original: TheLitCritGuy.com


For many academics it may seem that the rise of social media is yet another means of potential procrastination. Yet increasingly, certain academics have turned to social media not just as a way of accessing entertainment or as a tool for networking but as a means of engaging audiences in a brand new way.

Perhaps the most famous and well-known is @NeinQuarterly, an anonymous account that blends aphorisms, jokes, and an expert-level knowledge of German literature and culture to produce a fascinating and hugely popular account. Started by a former professor of German literature, @NeinQuarterly now brings its unique aphoristic and satirical style to print in German and Dutch newspapers, and last year saw the publication of Nein: A Manifesto, a book collecting his finest material that has been published in multiple languages. On YouTube, aside from John and Hank Green’s famous ‘Crash Course,’ there is PhilosophyTube, an account started from nothing just a few years ago that now has around 60,000 subscribers following its videos on Masters-level philosophy.

Personally, my own anonymous account started for far less career-minded reasons. Having finished my Master’s degree and with a twitter account that I didn’t really use, I decided to dedicate it to talking about the thinkers and ideas that had intrigued me during Masters study and provoked me into applying for a PhD. I decided to cover literary theorists and critics who had been only briefly touched upon during my undergraduate degree. When I started the account I was convinced it would be largely ignored, yet after tweeting to a few more widely followed accounts it picked up a surprising number of engaged and highly curious followers. Almost immediately, issues such as a posting schedule, what to talk about, and even the limits of my own knowledge became things that had to be dealt with. With a vocal and supportive group of followers, I was forced to be honest about my own limitations and my own inexperience, and to allow myself to discover the liberating freedom of telling followers that I don’t know; that I would love to know more about something (something almost unthinkable in the high-pressure environment of PhD research). The pressures of normal life meant that the account often became deeply personal as well as academic, and this seemed only to deepen the connection between me and the great groups of people who followed the account.

On top of this, anonymity comes with certain benefits that using social media with a name and a face doesn’t carry. From behind the “persona” of TheLitCritGuy my opinions don’t need to be run against what my institution or its managers might deem to be acceptable. Anonymity also allows the freedom for a kind of character to emerge. Behind anonymity, anger at the conditions of higher education for ECRs and students can be expressed more forcefully, and I also get to mash up jokes with theory without worrying colleagues will take me less seriously.

For academics who wish to take to social media and use it in a way beyond networking or sharing cat videos there is no sure fire way of doing things, but in the course of my own experiment there are a few things that I’ve found to have worked.

Firstly, have a distinctive voice. Anonymous accounts do not necessarily have a name or a face, but they depend upon having a distinctive perspective to offer. On Twitter, pseudonymous accounts such as @EthicistForHire and @CrankyEthicist immediately offer potential followers, from the name alone, an insight into what the account is like.

Secondly, have a purpose. One of the most successful anonymous accounts in #AcademicTwitter, @AcademicsSay, posts collections of jokes that connect really strongly with academics – jokes about coffee, about being overworked, and the ever-present catchphrase that ‘you should be writing.’ These highly sharable posts keep the account focused and with a clear sense of purpose, allowing it to grow to being followed by hundreds of thousands of people.

Thirdly, find your audience. Rather than just post into the void, the best academic accounts use the tools of social media to find an interested audience. Most notably, there are hashtags like #twitterstorians, where historians post and organise their thoughts, allowing an audience who want to engage with historians to find them. I always try and organise my own posting under #TheoryTime, allowing followers to keep up with what I’m talking about and catch up on topics they may have missed.

Fourth, expand. Whilst my own twitter account was successful, I quickly encountered the limitations of the form. I decided to expand my account into a research blog, as well as using the platform I built on twitter to write on new websites, bringing @TheLitCritGuy to a much wider audience.

Finally, connect. Whilst people follow an account or watch a YouTube channel to gain knowledge, using social media allows for academia to become more personally relatable – rather than a hierarchy of a teacher with students, twitter becomes a space of conversation and mutual education. Whilst I try and keep the important details of my life private from my account, a few personal details, personal opinions, and replies to followers makes the account more vibrant, more interesting and much more fun for those following.

It is this that makes anonymous accounts so effective too – outside of the structures, rules and roles of university networking, the anonymous account can become a place where academic researchers get to connect directly with an audience. Impact becomes something more than just a metric as people get to connect with academics beyond the realm of university organised public engagement events. Furthermore, this use of social media allows the public to see what life as an academic can be like, in all of its good and bad points.

Behind the anonymity of a nameless, faceless account I’ve shared some of the struggles of being an early career researcher, news about the state of the wider UK HE environment and the sheer joy of teaching as well as sharing and talking about my own research and intellectual passions. Whilst anonymous accounts bring a certain degree of freedom, there is the pressing awareness that my account won’t necessarily benefit my career within the university system. However, as more academics take to social media, using anonymous accounts allows for a new kind of creative, flexible academic to emerge, more closely linked with the public rather than embedded within the ivory towers of the university system.

I’ve received countless tweets, Facebook messages, and emails from people across the world who, through various pressures, felt they couldn’t pursue their own passion for literature and theory – needing a job, or dealing with their children, they feel like they’ve missed out on a swathe of knowledge, and it’s a genuine privilege to answer their questions and learn from them. Whether it be emailing economists about Foucault or letting a nursing student know more about phenomenology, using social media has shown me that beyond the limits of the university classroom, people are curious and searching for new ways to be engaged and to learn. Social media can change how we teach and spread knowledge beyond the limits of the university, and through anonymity academics might well find the freedom to connect with the public like never before.


Who Do You Think You Are – Galen Strawson and Life Online

Author: @TheLitCritGuy
Original: TheLitCritGuy.com


One of the most often repeated complaints and criticisms around literary theory is that it lapses frequently into obscurantism and obfuscation. Whilst this is nothing but deeply unfair and inaccurate, it has to be acknowledged that there is a great deal of theory that is often difficult to apply to the realities of modern life.  The effort of applying the abstract and removed language of the academy to the mundane details of existence is a hermeneutical exercise that we don’t always have the time or the energy to do.

This doesn’t mean that theory is irrelevant: how we construct and understand our lives is exactly the sort of question that theoretical writing directly concerns itself with – issues of identity, consciousness and perception are all areas that theorists have sought to understand. These complex issues are further problematized when one examines the shift in how the self finds cultural and social expression. It used to be that the predominant mode in which this occurred was face to face. We understood ourselves in the context of relationships, be they professional, familial or social. With the rise of technology and the now ubiquitous ‘social media,’ that web of relationships has shifted online.

We have friends.

We have followers.

We get likes, RT’s and re-blogs.

Essentially, things have changed. Before I go any further, this isn’t a plea for a return to a more idealistic and less technology-driven social experience. The two modes of existence share the same prevailing ideological model of how the individual understands themselves. We, speaking generally here, make sense of ourselves by constructing a narrative – one of the things that social media has done is make this process more obvious. One only has to look at Facebook timelines to see the explicit construction of your subjectivity, your life as a coherent narrative, designed to make you look your very best.

To quote Dan Dennett:

 ‘We are all virtuoso novelists…We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character…of that autobiography is one’s self’

Contained within the quote are two inter-related theses, which the great analytic philosopher and theorist Galen Strawson identified as the ‘Psychological Narrative Thesis’ and the ‘Ethical Narrative Thesis.’

Let me explain – the Psychological Narrative Thesis is a descriptive and empirical argument about how we see the world, a way of understanding life that is integral to human nature.  The Ethical Narrative Thesis is an argument coupled to the first, which posits that having or conceiving of one’s life in a narrative sense is necessary or essential for developing true or full personhood.

Now, one can think that these two interrelated ideas are some combination of true or false, but it’s worth examining how these two lines of argument operate online. The desire for narrative reflects our desire for coherence – we want desperately for the things we encounter online to make sense, to cohere in some way, so it should come as no surprise that this is how we treat others online.

The majority of the time this isn’t really an issue and one of the upsides of online culture is that it tends to treat people as whole and cohesive individuals. Basically, viewing people through the lens of a Narrative works out quite well most of the time – it allows us to make quick and generally fairly reliable judgements about the other and present ourselves in such a way that we can be easily comprehended too.

However, there is an issue here – the narrative thesis is a totalising one, a structuralist way of viewing the world and each other. The vast majority of the time it may be sufficient to view ourselves online as a seamless cohesive whole that tells a singular narrative story but this quickly runs into a problem – diachronic consistency.

To explain that in less technical-sounding words: the idea that a recognizable thread of consciousness persists through time within one individual just doesn’t hold up. It is not the disconnection within online life that irks, but the flawed drive for all of this to make sense, for all of our lives to be tied together in one neat package. We become authors who edit on the fly, making ourselves the neatest and tidiest selves we can be, desperate to excise the disparate and the different and the dysfunctional.

This isn’t a new problem – to quote the great Virginia Woolf:

Look within and life, it seems, is very far from being “like this”. Examine for a moment an ordinary mind on an ordinary day. The mind receives a myriad impressions — trivial, fantastic, evanescent, or engraved with the sharpness of steel. From all sides they come, an incessant shower of innumerable atoms; and as they fall, as they shape themselves into the life of Monday or Tuesday, the accent falls differently from of old…Life is not a series of gig lamps symmetrically arranged; life is a luminous halo, a semi-transparent envelope surrounding us from the beginning of consciousness to the end.

Viewing these neat and tidy profiles, those expertly curated twitter streams, Woolf’s quote takes on fresh resonance. Life, indeed, does not seem to be like this. If social media and internet living is where we will all increasingly be, it must become a place where the honest expression of the many different internal selves can find a place. Perhaps we need less narrative – less desire to be a coherent singular story that others *like* – and more spaces where the individual can change, be contradictory and experience anew.
