Who Do You Think You Are – Galen Strawson and Life Online

Author: @TheLitCritGuy
Original: TheLitCritGuy.com


One of the most often repeated complaints and criticisms around literary theory is that it lapses frequently into obscurantism and obfuscation. Whilst this is deeply unfair and inaccurate, it has to be acknowledged that there is a great deal of theory that is often difficult to apply to the realities of modern life. The effort of applying the abstract and removed language of the academy to the mundane details of existence is a hermeneutical exercise that we don’t always have the time or the energy for.

This doesn’t mean that theory is irrelevant, as how we construct and understand our lives is a question that theoretical writing directly concerns itself with – issues of identity, consciousness and perception are all areas that theorists have sought to understand. These complex issues are further problematized when one examines the shift in how the self finds cultural and social expression. It used to be that the predominant mode in which this occurred was face to face. We understood ourselves in the context of relationships, be they professional, familial or social. With the rise of technology and the now ubiquitous ‘social media’, that web of relationships has shifted online.

We have friends.

We have followers.

We get likes, RTs and re-blogs.

Essentially, things have changed. Before I go any further, this isn’t a plea for a return to a more idealistic and less technology-driven social experience. The two modes of existence share the same prevailing ideological model of how the individual understands themselves. We, speaking generally here, make sense of ourselves by constructing a narrative – one of the things that social media has done is make this process more obvious. One only has to look at Facebook timelines to see the explicit construction of subjectivity: a life presented as a coherent narrative, designed to make us look our very best.

To quote Dan Dennett:

 ‘We are all virtuoso novelists…We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character…of that autobiography is one’s self’

Contained within the quote are two interrelated theses, which the great analytic philosopher and theorist Galen Strawson identified as the ‘Psychological Narrative Thesis’ and the ‘Ethical Narrative Thesis.’

Let me explain – the Psychological Narrative Thesis is a descriptive, empirical argument about how we see the world: that a narrative way of understanding life is integral to human nature. The Ethical Narrative Thesis is an argument coupled to the first, which posits that having or conceiving one’s life in a narrative sense is necessary or essential for developing true or full personhood.

Now, one can think that these two interrelated ideas are some combination of true or false, but it’s worth examining how these two lines of argument operate online. The desire for narrative reflects our desire for coherence – we desperately want the things we encounter online to make sense, to cohere in some way, so it should come as no surprise that this is how we treat others online.

The majority of the time this isn’t really an issue, and one of the upsides of online culture is that it tends to treat people as whole and cohesive individuals. Basically, viewing people through the lens of a narrative works out quite well most of the time – it allows us to make quick and generally fairly reliable judgements about the other, and to present ourselves in such a way that we can be easily comprehended too.

However, there is an issue here – the narrative thesis is a totalising one, a structuralist way of viewing the world and each other. The vast majority of the time it may be sufficient to view ourselves online as a seamless, cohesive whole telling a singular narrative story, but this quickly runs into a problem: diachronic consistency.

To explain that in less technical-sounding words: the idea that a recognizable thread of consciousness persists through time within one individual just doesn’t hold up. It is not the disconnection within online life that irks, but the flawed drive for all of this to make sense, for all of our lives to be tied together in one neat package. We become authors who edit on the fly, making ourselves the neatest and tidiest selves we can be, desperate to excise the disparate, the different and the dysfunctional.

This isn’t a new problem – to quote the great Virginia Woolf:

Look within and life, it seems, is very far from being “like this”. Examine for a moment an ordinary mind on an ordinary day. The mind receives a myriad impressions — trivial, fantastic, evanescent, or engraved with the sharpness of steel. From all sides they come, an incessant shower of innumerable atoms; and as they fall, as they shape themselves into the life of Monday or Tuesday, the accent falls differently from of old…Life is not a series of gig lamps symmetrically arranged; life is a luminous halo, a semi-transparent envelope surrounding us from the beginning of consciousness to the end.

View these neat and tidy profiles, those expertly curated Twitter streams, and Woolf’s quote takes on fresh resonance. Life, indeed, does not seem to be like this. If social media and internet living are where we will all increasingly be, they must become places where the honest expression of the many different internal selves can find a home. Perhaps we need less narrative – less desire to be a coherent, singular story that others *like* – and more spaces where the individual can change, be contradictory and experience anew.


The Valley of Shit

Author: Inger Mewburn
Original: Thesis Whisperer


I have a friend, let’s call him Dave, who is doing his PhD at the moment.

I admire Dave for several reasons. Although he is a full time academic with a young family, Dave talks about his PhD as just one job among many. Rather than moan about not having enough time, Dave looks for creative time management solutions. Despite the numerous demands on him, Dave is a generous colleague. He willingly listens to my work problems over coffee and always has an interesting suggestion or two. His resolute cheerfulness and ‘can do’ attitude are an antidote to the culture of complaint which seems, at times, to pervade academia.

I was therefore surprised when, for no apparent reason, Dave started talking negatively about his PhD and his ability to finish on time. All of a sudden he seemed to lose confidence in himself, his topic and the quality of the work he had done.

Dave is not the only person who seems to be experiencing these feelings lately. I have another friend, let’s call him Andrew.

Andrew is doing his PhD at a prestigious university and has been given an equally prestigious scholarship. Like Dave, Andrew approaches his PhD as another job, applying the many time management skills he had learned in his previous career. He has turned out an impressive number of papers, much to the delight of his supervisors.

Again I was shocked when Andrew emailed me to say he was going to quit. He claimed everything he did was no good and it took a number of intense phone calls to convince him to carry on.

Both these students were trapped in a phase of PhD study I have started to call “The Valley of Shit”.

The Valley of Shit is that period of your PhD, however brief, when you lose perspective and therefore confidence and belief in yourself. There are a few signs you are entering into the Valley of Shit. You can start to think your whole project is misconceived or that you do not have the ability to do it justice. Or you might seriously question if what you have done is good enough and start feeling like everything you have discovered is obvious, boring and unimportant. As you walk deeper into the Valley of Shit it becomes more and more difficult to work and you start seriously entertaining thoughts of quitting.

I call this state of mind the Valley of Shit because you need to remember you are merely passing through it, not stuck there forever. Valleys lead to somewhere else – if you can but walk for long enough. Unfortunately the Valley of Shit can feel endless because you are surrounded by towering walls of brown stuff which block your view of the beautiful landscape beyond.

The Valley of Shit is a terrible place to be because, well, not to put too fine a point on it – it smells. No one else can (or really wants to) be down there, walking with you. You have the Valley of Shit all to yourself. This is why, no matter how many reassuring things people say, it can be hard to believe that the Valley of Shit actually does have an end. In fact, sometimes those reassuring words can only make the Valley of Shit more oppressive.

The problem with being a PhD student is you are likely to have been a star student all your life. Your family, friends and colleagues know this about you. Their confidence in you is real – and well founded. While rationally you know they are right, their optimism and soothing ‘you can do it’ mantras can start to feel like extra pressure rather than encouragement.

I feel like I have spent more than my fair share of time in the Valley of Shit. I was Thesis Whispering while I was doing my PhD – so you can imagine the pressure I felt to succeed. An inability to deliver a good thesis, on time, would be a sign of my professional incompetence on so many levels. The Valley of Shit would start to rise up around me whenever I started second-guessing myself. The internal monologue went something like this:

“My supervisor, friends and family say I can do it – but how do they really KNOW? What if I disappoint all these people who have such faith in me? What will they think of me then?”

Happily, all my fears were groundless. My friends, teachers and family were right: I did have it in me. But boy – the smell of all those days walking in the Valley of Shit stays with you.

So I don’t want to offer you any empty words of comfort. The only advice I have is: you just have to keep walking. By which I mean just keep writing, doing experiments, analysis or whatever – even if you don’t believe there is any point to it. Remember that you are probably not the right person to judge the value of your project or your competence right now.

Try not to get angry at people who try to cheer you on; they are only trying to help. Although you are alone in the Valley of Shit there is no need to be lonely – find a fellow traveller or two and have a good whinge if that helps. But beware of indulging in this kind of ‘troubles talk’ too much lest you start to feel like a victim.

Maybe try to laugh at it just a little.

You may be one of the lucky ones who only experience the Valley of Shit once in your PhD, or you might be unlucky and find yourself there repeatedly, as I did. I can completely understand those people who give up before they reach the end of the Valley of Shit – but I think it’s a pity. Eventually it has to end because the university won’t let you do your PhD forever. Even if you never do walk out the other side, one day you will just hand the thing in and hope for the best.


The Productivity Robbing Myths of Grad School

Author: Steve Shaw
Original: How Not To Suck at Grad School


I am not sure if there is a best way to be efficient and productive as there are many very different, but positive, ways to work. However, there are some common and universally terrible ways to work. Here are a few things that I hear students say with pride that are actually signs of an inefficient worker.

“I do my best work at the last minute. I thrive under pressure.”

–No. The first draft of everything is terrible, even for the best writer. You may be an extremely good binge writer, but I promise that the work will be better with another draft and some time to consider and change content.  Plan your time well. The draft of any project should be completed three days to two weeks before it is due. The remainder of the time can be spent in the real work of writing: editing.

“I am not a detail person. I am an idea person.”

–Ideas that are well-researched, communicated in detail, completely thought out, and effectively implemented are useful. All others tend to be vague dreams that border on hallucinations. Everyone is a dreamer, but the truly useful person works hard and uses detail to convert dreams into reality.

“I am a perfectionist.”

–This is not a positive trait. Trying to pursue perfection is a useless activity that is harmful to well-being and productivity. Being conscientious, detail focused, and striving for excellence are laudable characteristics. Perfectionism is maladaptive.

When I hear people tell me that they are a perfectionist, I feel the need to assess further to determine if we simply are defining perfectionism differently or if their behavior is maladaptive. Usually people mean that they are detail focused and striving for excellence with undertones of anxiety. This is typically a good set of characteristics for grad students. But when they mention the need to be perfect, then we are into a zone where anxiety may be maladaptive. Seeking excellence is good. Seeking perfection is a neurotic waste of time.

“I edit while I write.”

–This is a guaranteed method of getting nothing finished or severely limiting your productivity. Get all of your ideas out on paper. Only edit when you have completed a document, or at least a substantial portion of one. Editing while writing is slow, makes for choppy prose, reduces flow and creativity, and increases anxiety. People with this habit also tend to be perfectionists who learned the habit while doing last-minute work. Take the time to complete a full draft and then edit.

“I don’t want to show this to you until it is ready.”

–I understand this secrecy problem. Some supervisors are extremely judgmental and even hostile to unfinished work. Submitting any work is aversive under these conditions. The best approach is to have students submit work on a timed basis, even if it is raw. The difference between a professional and an amateur writer is deadlines. Working to a deadline is more important than achieving the mythic ideal paper. I also find that when students wait to submit their ideal paper, they are crushed when substantial revisions are required. The supervisor can make suggestions and edits, improve the paper, and move on without judgment. The goal is to develop a relationship that produces a large amount of scholarly material in an efficient manner. Trust between a student and supervisor is the best way to make this happen. When secrecy is fostered, we are teaching grad students to be perfectionists and adding anxiety to their lives.

“I’m a multi-tasker.”

–You are not. You can only attend to one task at a time. Many folks have developed a sophisticated skill set where they actively shift attention from one task to another. You attend to the television for a few minutes and then back to your book—you cannot do both at the same time. The same goes for radio or music. You can focus on music or focus on your work, not both. What we tend to do is shift attentional focus. If you are listening to music and you know what was playing and enjoyed it, then you are shifting focus. Once you are in an activity where you are shifting focus between two things, your efficiency is being robbed. There is some evidence that music with a constant beat and no lyrics can actually aid in concentration and focus. Classical music is an example. When I am at my most scattered, I listen to a metronome to help with focus. But no one is truly multitasking; you are rapidly shifting attention and reducing efficiency. This is not necessarily bad, but it is inefficient and should be used sparingly.

My wife works from home with the TV on. She says that she likes the noise while she works. However, when I ask her what she is watching on television, she has no idea. She is certainly losing some focus, but not as much as she would if she were actually attending to the TV. I watch television while working only on weekends. I am mostly watching TV, but get a little work done during commercials. Not efficient and focused work, but better than nothing.

White noise can be a better idea than music or TV. White noise can be ideal for folks who like a level of sound to mask the often jarring ambient noise of your real environment such as construction, lawn maintenance, and loud neighbors. There are several white noise generators available online such as http://mynoise.net/NoiseMachines/whiteNoiseGenerator.php and http://simplynoise.com/ . One of my favourite websites and apps is http://www.coffitivity.com/. This site plays the ambient noise from a coffee shop. You can even select the type of coffee shop noise from “morning murmur” to “lunchtime lounge” to “university undertones.” This style of white noise is also helpful for the folks who actually prefer to do creative work in coffee shops, but cannot get there. I do not understand how people do this as my attention flits to the homeless guy, the hostile person in a long line, and the sounds of coffee slurpers; nonetheless many people do their creative work in coffee shops. The white noise from coffitivity is associated with a place of creativity, which can put you in the mood to work. The secret of white noise is that there is no content in the noise to draw attention away from your work.

Once I learned the skill of unitasking, I became at least twice as efficient as before. Now I do one thing fully focused until completed and then turn my attention to the next task. Not only is my work completed at a faster pace as a unitasker; I enjoy movies, TV, and music much more. And as an extra bonus, there are not the nagging feelings of guilt that go along with such multitasking.

We all develop work habits, and there are many ways to be a productive worker. But as grad students and professors face increasing pressure to produce, the limits of our work habits are often reached and exceeded. What worked as an undergrad no longer works and now falls under the heading of a maladaptive habit. There is a constant need to hone your work habits and remove these productivity-robbing myths and habits from your work.


Giving Up On Academic Stardom

Author: Eric Grollman
Original: Conditionally Accepted


I have bought into the ego-driven status game in academia. Hard. I find myself sometimes wondering more about opportunities to advance my reputation, status, name, and scholarship than about creating new knowledge and empowering disadvantaged communities. Decision-making in my research often entails asking what will yield the most publications, in the highest-status journals, with the quickest turnaround in peer review. I often compare my CV to others’, wondering how to achieve what they have that I have not, and feeling smug about achieving things that they haven’t. Rarely do I ask how to become a better researcher, but often ask how to become a more popular researcher.

I have drunk the Kool-Aid, and it is making me sick. Literally. The obsession with becoming an academic rockstar fuels my anxiety. I fixate on what is next, ignore the present, and do a horrible job of celebrating past achievements and victories. I struggle to accept “acceptable.” I feel compelled to exceed expectations; I take pride when I do. “Wow, only six years in grad school?” “Two publications in your first year on the tenure track?! And, you’re at a liberal arts college?”

When did I become this way? Sure, academia is not totally to blame. My parents expected me to surpass them in education (they have master’s degrees!). I also suffer, as many gay men do, from the desire to excel to gain family approval, which is partially lost upon coming out. Excelling in college, rather than becoming an HIV-positive drug addict, helped my parents to accept my queer identity. In general, I compensate professionally and socially for my publicly known sexual orientation. It is hard to unlearn the fear that one will not be loved or accepted, especially when homophobes remind you that fear is a matter of survival.

Oh, but academia. You turned this achievement-oriented boy into an anxious wreck of a man. It is not simply a bonus to be an academic rockstar of sorts. My job security actually depends on it. And, it was necessary to be exceptional to even get this job. And, it matters in other ways that indirectly affect my job security, and my status in general. You can forget being elected into leadership positions in your discipline if no one knows you. “Who?” eyes say as they read your name tag at conferences before averting their gaze to avoid interacting. I have learned from my critics that one must be an established scholar before you can advocate for change in academia.

The Consequences Of Striving For Academic Stardom

I am giving up on my dream to become the Lady Gaga of sociology. I have to do so for my health. I have to stop comparing myself to other scholars because so many things vary, making it nearly impossible to find a truly fair comparison. Of course, I will never become the publication powerhouse of an Ivy League man professor whose wife is a homemaker. Even with that example, I simply do not know enough about another person’s life, goals, and values to make a comparison. I do not want others to compare themselves to me because my level of productivity also entails Generalized Anxiety Disorder. I am not a good model, either!

Dreams of academic stardom prevent me from appreciating my present circumstances, which were not handed to me. Sadly, voices, which sound awfully similar to my dissertation committees’, have repeatedly asked, “are you surrreeee you don’t want to be at an R1?” I have zero interest in leaving, and negative interest (if that is possible) in enduring the job market again. But, I fear that, as I was warned, I will become professionally irrelevant; and, this has made it difficult to fully appreciate where I am. I have acknowledged the reality that no place will be perfect for an outspoken gay Black intellectual activist. But, I have found a great place that holds promise for even better.

Beyond my health, the lure of academic stardom detracts from what is most important to me: making a difference in the world. Impact factors, citation rates, and the number of publications that I amass distract from impact in the world and accessibility. It is incredibly selfish, or at least self-serving, to focus more energy on advancing my own career rather than advancing my own communities.

Obsession with academic rockstardom forced me to view colleagues in my field as competition. My goal became to demonstrate in my research that what I do is better than what they do. In doing so, I fail to see how we might collaborate directly on projects, or at least join as a chorus of voices on a particular social problem. Yet, in reality, no individual’s work can make a difference alone. I also fail to appreciate the great things my colleagues accomplish when I view them only through jealous eyes.

When I die, I do not want one of my regrets to be that I worked too hard, or did not live authentically, or did not prioritize my health and happiness as much as I did my job.  Ok, end of rant.


The Lie Guy

Author:
Original: Chronicle of Higher Education


You’d think I’d get used to being called a liar. After all, I’ve written a candid, semiautobiographical novel about being a scam artist, been interviewed in the media about my former life of lying, cheating, and drinking, even edited a prominent philosophical collection on deception. But when a colleague recently ridiculed me for being known as a liar, my feelings were hurt. I have a new life. I’ve been clean and sober and “rigorously honest” (as we say in AA) for two years. Still, to tell you the truth (honestly!), I earned my reputation fair and square.

In the Internet age, a sordid past is a matter of very public record—for that matter, of public exaggeration—and if you write fiction and memoir about your worst days, as I did (and continue to do), even your students will take the time to read the racy parts (or at least excerpts in online interviews of the racy parts, or YouTube interviews about the racy parts).

God bless and keep tenure—I’d probably hesitate to be frank in this essay without it—although, to be fair to my institution, the ignominious stories about me and my novel were out before my committee granted me tenure. “It takes an odd person to work on lying,” my late mentor (and friend and co-author), the philosopher Robert C. Solomon, once told me, himself having written one or two of the best papers on the subject.

When I was 26 years old, in 1993, I dropped out of grad school at the University of Texas at Austin—I was on a fellowship, staring day after day at my stalled dissertation among stacks of books and papers from the Kierkegaard Archive in the Royal Library in Copenhagen—to go into the luxury-jewelry business. I decided to burn all of my bridges. I didn’t fill out any forms. I didn’t have the ordinary courtesy even to contact my two dissertation directors, Solomon and Louis H. Mackey. I just vanished.

I told myself that it was a conscious strategy, to prevent myself from going back, but I also knew the truth: that I was simply too ashamed to tell them that I had gone into business for the money. Like many of our deceptions, mine was motivated by cowardice: “Tell the people what they want to hear,” or, if you can’t do that, simply don’t tell them anything at all.

A few years later, my next-door neighbor (my wife and I had just moved in) caught me in the driveway and asked, “Hey, Clancy. Did you go to grad school at the University of Texas?”

“I did, that’s right.” I was already uncomfortable. I opened the door of my convertible. The Texas summer sun frowned cruelly down on me.

“I’m an editor of Bob Solomon’s. He told me to say hello.”

Busted. This was Solomon’s way of calling me on my b.s. It was his personal and philosophical motto, adopted from Sartre: “No excuses!” Take responsibility for your actions. Above all, avoid bad faith. Look at yourself in the mirror and accept—if possible, embrace—the person that you are.

But I was on my way to work, and Bob Solomon, at that point in my life, was the least of my problems. I had him stored neatly in the mental safety-deposit box of “people I had not lied to but had betrayed in a related way.”

The jewelry business—like many other businesses, especially those that depend on selling—lends itself to lies. It’s hard to make money selling used Rolexes as what they are, but if you clean one up and make it look new, suddenly there’s a little profit in the deal. Grading diamonds is a subjective business, and the better a diamond looks to you when you’re grading it, the more money it’s worth—as long as you can convince your customer that it’s the grade you’re selling it as. Here’s an easy, effective way to do that: First lie to yourself about what grade the diamond is; then you can sincerely tell your customer “the truth” about what it’s worth.

As I would tell my salespeople: If you want to be an expert deceiver, master the art of self-deception. People will believe you when they see that you yourself are deeply convinced. It sounds difficult to do, but in fact it’s easy—we are already experts at lying to ourselves. We believe just what we want to believe. And the customer will help in this process, because she or he wants the diamond—where else can I get such a good deal on such a high-quality stone?—to be of a certain size and quality. At the same time, he or she does not want to pay the price that the actual diamond, were it what you claimed it to be, would cost. The transaction is a collaboration of lies and self-deceptions.

Here’s a quick lesson in selling. You never know when it might come in handy. When I went on the market as a Ph.D., I had six interviews and six fly-backs. That unnaturally high ratio existed not because I was smarter or more prepared than my competition. It was because I was outselling most of them.

Pretend you are selling a piece of jewelry: a useless thing, small, easily lost, that is also grossly expensive. I, your customer, wander into the store. Pretend to be polishing the showcases. Watch to see what is catching my eye. Stand back, let me prowl a bit. I will come back to a piece or two; something will draw me. You see the spark of allure. (All great selling is a form of seduction.) Now make your approach. Take a bracelet from the showcase that is near, but not too near, the piece I am interested in. Admire it; polish it with a gold cloth; comment quietly, appraisingly on it. You’re still ignoring me. Now, almost as though talking to yourself, take the piece I like from the showcase: “Now this is a piece of jewelry. I love this piece.” Suddenly you see me there. “Isn’t this a beautiful thing? The average person wouldn’t even notice this. But if you’re in the business, if you really know what to look for, a piece like this is why people wear fine jewelry. This is what a connoisseur looks for.” (If it’s a gold rope chain, a stainless-steel Rolex, or something else very common and mundane, you’ll have to finesse the line a bit, but you get the idea.)

From there it’s easy: Use the several kinds of lies Aristotle identified in Nicomachean Ethics: A good mixture of subtle flattery, understatement, humorous boastfulness, playful storytelling, and gentle irony will establish that “you’re one of us, and I’m one of you.” We are alike, we are friends, we can trust each other.

The problem is, once lying to your customer as a way of doing business becomes habitual, it reaches into other areas of your business, and then into your personal life. Soon the instrument of pleasing people becomes the goal of pleasing people. For example, who wouldn’t want to buy a high-quality one-carat diamond for just $3,000? (Such a diamond would cost $4,500 to $10,000, retail, depending on where you buy it.) But you can’t make a profit selling that diamond for $3,000—you can’t even buy one wholesale for that amount. Since the customer can’t tell the difference anyway, why not make your profit and please the customer by simply misrepresenting the merchandise? But that’s deceptive trade! There are laws against that! (There’s a body of federal law, in fact: the Uniform Deceptive Trade Practices Act. Texas awards triple damages plus attorney’s fees to the successful plaintiff.) Aren’t you worried about criminal—or at least civil—consequences? And how do you look at yourself in the mirror before you go to bed at night?

During my bleakest days in business, when I felt like taking a Zen monk’s vow of silence so that not a single lie would escape my lips, I often took a long lunch and drove to a campus—Southern Methodist University, Texas Christian University, the University of Texas at Arlington—to see the college kids outside reading books or holding hands or hurrying to class, and to reassure myself that there was a place where life made sense, where people were happy and thinking about something other than profit, where people still believed that truth mattered and were even in pursuit of it. (OK, perhaps I was a bit naïve about academic life.)

I was in the luxury-jewelry business for nearly seven years, and though I don’t believe in the existence of a soul, exactly, I came to understand what people mean when they say you are losing your soul. The lies I told in my business life migrated. Soon I was lying to my wife. The habit of telling people what they wanted to hear became the easiest way to navigate my way through any day. They don’t call it “the cold, hard truth” without reason: Flattering falsehoods are like a big, expensive comforter—as long as the comforter is never pulled off the bed.

It seemed that I could do what I wanted without ever suffering the consequences of my actions, as long as I created the appearance that people wanted to see. It took a lot of intellectual effort. I grew skinnier. I needed more and more cocaine to keep all my lies straight. And then, one morning, I realized that I had been standing in “the executive bathroom” (reserved for my partner and myself) at the marble sink before a large, gilt Venetian mirror every morning for days, with my Glock in my mouth (in the jewelry business, everyone has a handgun). I still remember the oily taste of that barrel. Before I confronted the fact that I was trying to kill myself, I had probably put that gun in my mouth, oh, I don’t know—20, 30 times. I said, “Enough.”

I called Bob Solomon. That was in May of 2000.

I was relieved when he didn’t answer his phone. I left a message: “I’m sorry, Dr. Solomon. I’d like to come back.” Words to that effect, but at much greater length. I think the beep cut me off.

When he called back, I was too frightened to pick up. I listened to his voice-mail message. He said, “Clancy, this is not a good time to make yourself difficult to get ahold of.”

I called again. He let me off easy. (He was perhaps the most generous person I’ve ever known.) I caught him up with the past six years of my life. He told me to call him Bob, not Dr. Solomon: “We’re past that.” Then he said, “So, why do you want to come back?”

“I want to finish what I started, Bob.”

“That’s a lousy reason. Try again.”

“I need to make a living that’s not in business. I hate being a businessman, Bob.”

“So be a lawyer. Be a doctor. You’ll make more money. It’s not easy to get a job as a professor these days, Clancy.”

“It’s the one thing I really enjoyed. Philosophy was the only thing that ever truly interested me. And I have some things I want to figure out.”

“Now you’re talking. Like what? What are you thinking about?”

“Lying. Or failure. I feel like I know a lot about both of them right now.”

(I was writing a long essay about suicide, which, come to think of it, might have been more to the point at the time. But I didn’t want to scare him off.)

A beat.

“Nobody wants to read about failure. It’s too depressing. But lying is interesting. Deception? Or self-deception? Or, I’m guessing, both?”

“Exactly. Both. How they work together.”

With the help of a couple of other professors who remembered me fondly, in the fall semester of 2000, Bob Solomon brought me back to the philosophy doctoral program at Austin, and I started work on a dissertation called “Nietzsche on Deception.” One of the other graduate students—Jessica Berry, now one of philosophy’s best young Nietzsche scholars—called me “the lie guy,” and the moniker stuck.

I went to work on deception not because I wanted to learn how to lie better—I had mastered the art, as far as I was concerned—but because I wanted to cure myself of being a liar. What had started out as a morally pernicious technique had become a character-defining vice. I had to save myself. I needed to understand the knots I had tied myself into before I could begin to untangle them. (It seems like an odd solution now. At the time, I thought I was too smart for therapy.)

It’s an old idea, of course: The Delphic injunction “Know thyself” is an epistemological duty with moral muscle, intended for a therapeutic purpose. Throughout the history of philosophy, until quite recently, it was thought that the practice of philosophy should have a powerful impact on the philosopher’s life—even, ideally, on the lives of others. So I studied deception and self-deception, how they worked together, why they are so common, what harms they might do, and when, in fact, they may be both useful and necessary. Think, for example, about the misrepresentation, evasion, and self-deception involved in falling in love. Who hasn’t asked, when falling in love, “But am I making all this up?” Erving Goffman would have agreed with the joke—I think we owe it to Chris Rock: “When you meet someone new, you aren’t meeting that person, you’re meeting his agent.”

I was lucky: I was awarded my Ph.D. in 2003, and I got a job. Being part of a university as a professor was very different from being a student, even a grad student. Suddenly you have power. In business—especially in retail—the customer has all the power. But students are nothing like customers, although they are starting to act more and more that way, I’ve noticed, and have eagerly adopted the motto “the customer is always right.” My fellow professors wore their power like a crown. They didn’t feel the need to pull a smile out of anyone.

I was still going from classroom to committee room trying to please everyone. I don’t think it harmed me or anyone else, particularly: It was simply unnecessary. As that sank in, I became disoriented. It reminded me of when I was in St. Petersburg, Russia, in the 1990s, trying to hire the world’s best (and most underpaid) jewelers. No one cared about your money. The concept hadn’t yet sunk its teeth into the post-Communist soul. Similarly, in academe, no one paid much attention to the capital—charm—I was accustomed to spending in my daily life.

In fact, charm could even be a hindrance. In my first year, I was asked by a senior colleague to be the research mentor to a philosopher who had been hired around the same time. After talking about my research, my colleague added, “You are mostly who you seem to be.” This from a man who prided himself on being only who he seemed to be—as though we are all only one person!—and as a way of letting me know that he had “seen through me,” that he “was not prey to my charms.” Also, no doubt he was gently letting me know that I didn’t have to pretend to be someone other than I was.

In my old life, everyone was always trying to be more charming than everyone else—even the gruffness of certain wholesalers was (everyone understood) only pretense, the pose of authenticity, the rough exterior that hid the honest, caring heart. To be charming was among the highest virtues.

But now the chair of a science department at my university—a person whom I like very much, and who is enormously charming—and other colleagues often seem suspicious of charm in anyone. Charm is what you expect from administrators, and they, we all know, are not to be trusted. Administrators are just glorified salespeople who can’t publish (so the story goes). A charming student is a dishonest student, an apple polisher.

If I was a bit rude to people, however, if I acted superior, if I had the right mix of intellectual distance and modest moral disdain, I was suddenly a member of the club. I had to be the opposite of eager to please. Other people must be eager to please me. And if they were, I should be suspicious of them. They should be subservient without being (obviously) obsequious. They can flatter, but never as a salesperson flatters; I want flattery only from my equals. This from people who were regularly checking RateMyProfessors.com to see how many hot peppers they’d earned. Or who fretted—or, still worse, pretended not to fret—about their teaching evaluations.

I got Bob Solomon on the phone again.

“Bob, the professor business is even sleazier than the jewelry business. At least in the jewelry business we were honest about being fake. Plus, when I go to conferences, I’ve never seen such pretentiousness. These are the most precious people I’ve ever met.”

“Come on, Clancy. Did you really think people were going to be any better in a university?”

“Um, kind of.” Of course I did. “And it’s not that they’re not better. They’re worse.”

“Well, you may have a point there.” (Bob was always very tough on the profession of being a professor.) “Focus on the students and your writing. The rest of it is b.s.” (That was a favorite expression of Bob’s, as it is of a former colleague of his at Princeton, Harry Frankfurt.)

“With the students, I still feel like I’m selling.” (I was very worried about this.)

“You are selling. That’s part of what it is to be a good teacher.” (Bob was in the university’s Academy of Distinguished Teachers and had won every teaching award in the book. He also made several series of tapes for the Teaching Company.) “To be a good teacher, you have to be part stand-up comic, part door-to-door salesman, part expert, part counselor. Do what feels natural. Be yourself. Are your students liking it? Is it working for you?”

“Yes.” They liked it all right, maybe a bit too much. “And I think they’re learning.”

“Then forget about the rest of it. Just have fun. That’s the best reason for doing it.”

Stendhal wrote: “With me it is a matter of almost instinctive belief that when any … man speaks, he lies—and most especially when he writes.” I still like to tell a good story. But doesn’t everybody who loves teaching? How else are you going to liven up the classroom when students’ eyes are always turning to their iPhones or laptops?

People often ask me now if I miss the jewelry business. My brother and I rode elephants in the mountains of northern Thailand to buy rubies from the miners. I flew to Hong Kong to buy a rope of gigantic black South Sea pearls—each nearly the size of your thumb—and a precious antique jade bracelet from a dying Chinese billionairess, and flew to Paris two days later to sell it to a customer. I walked through the winding, crowded streets of Jerusalem with my diamond wholesaler, talking about the two-state solution. I stayed at the Four Seasons, the Mandarin Oriental, or private mansions of friends. I lived shoulder-to-shoulder with celebrity clients, flew first class, had my suits custom-made, vacationed in Bali or wherever I wanted. More important—thinking of my life today—I didn’t worry about whether my daughters might have to take out student loans.

And the truth is, a lot of the time, that life was fun. The people were rich, noisy, outrageous. When I opened a new store, I felt like I’d created something special.

Would I go back? Do I miss it? No. Sometimes—I write this looking out my office window at the 100-year-old trees outside, their boughs barely lifting and falling in the autumn wind—I feel like a monk who has retreated from a world that was too much for him. “The greatest part of virtue lies in avoiding the opportunity for vice,” St. Augustine teaches us.

Maybe I’m persisting in a kind of self-deceptive naïveté that Bob wouldn’t have approved of, but you could say that my livelihood now depends on telling the truth. Back then I was arms-and-shoulders deep into life, and now at times I feel as though I am only skating on its mirrored surface. But I’d be afraid to go back. I feel peaceful now. It’s less work to be me, and to have me around. I don’t feel the need to lie. Most of the time.

 


Dr. Martin’s new book on deception in romantic relationships, “Love and Lies,” is now available.


Je Suis Reviewer #2

Author: Ana Todorović
Original: Musings


I was recently invited to review a manuscript for a journal I follow regularly. The content was right along the lines of my kind of research, and I was happy to accept. I was, of course, Reviewer Number Two.

I always have been, in each of my fifteen-ish reviewing experiences. But this was the first time that the drop-down menu actually encouraged me to be your stereotypical Number Two.

I was in the second year of my PhD when that first reviewing assignment landed in my lap. So there I was, sifting through my inbox, deleting the “Dear Dr. Todorovic” flattery of predatory publishers. But then I hesitated at this one e-mail, because apart from the heading, it lacked the usual telltale signs of spam.

An invitation to review.

I forwarded it to my supervisor. “Oh, that’s a good journal – don’t you know it? I’d accept if I were you.” Wow. Pride and chagrin, all rolled into one.

I kept re-reading that manuscript, re-wording my review, postponing the submission. Should I tell the editor I’m a clueless student, and not Dr. Todorovic? Should I say I’ve never done this before, that I don’t know what I’m doing or why they picked me? That it’s all a big mistake?

In the end I said nothing. I pressed the submit button; the world didn’t implode. Two days later, an e-mail arrived. The other reviewer hadn’t done their job on time, and the editor thought my concerns were substantial enough to request a major revision. I was mortified.

***

It got easier over time and with some experience, but it never really got easy. I still open the report of the other reviewer, the one that knows what they’re doing, with trepidation. If they caught something I should have, I feel ashamed. If their misgivings align with mine, I’m flooded with relief. If I mention something they didn’t, I worry that I was nitpicking.

As Reviewer #2, I get to see plenty of weak papers in low-impact journals, written in broken English, with poorly described experimental procedures and inconclusive results. It’s very annoying when these come from good labs that write up their other papers, the ones I don’t get to review, with care.

I never know how much to judge and how much to help. I never know if helping will be seen as asking them to write the paper I would have written, the thing Reviewer #2 is notorious for. I never know what to do when I need just a few extra pieces of information to understand the design, before I can decide about the rest. I get frustrated when I don’t understand things, and I worry that this frustration will spill over into my review as pointless vitriol. Another feature of #2.

It’s worse when the journals are good. I can judge whether a design is creative and elegant, and can lead to the claimed conclusions. I can judge whether the analyses are sound. In some cases I will even check whether the numbers in the reported statistics all match up (you’re welcome). But when I have to judge novelty? And whether the wow effect matches the scope of the journal? Good grief, how should I know? Ask Reviewer #1, I can barely keep up with my own narrow topic.

***

Most of the learning from that first review onward was (and still is) a lonely process, with only the other reviewer’s comments as any substantial form of feedback. So every time I hear a gripe about Reviewer #2, I cringe a little on the inside. It’s me, it’s me, and I’m trying to be invisible.

I don’t think we should stop grumbling about Reviewer #2, I’m a big fan of complaining. But maybe, just maybe, a little bit of structured guidance would help? Someone to show us how to be kind but decisive. To tell us to always list strong points, then voice our misgivings as suggestions for improvement. To consider whether the experiment is something others would care to know about before we rip it apart.

I had a supervisor who showed me the ropes, but this shouldn’t be left to individual group leaders. We’re all in this together, both causing the damage and taking it. Instead of throwing young researchers into it head-first, maybe we can teach them, and make reviewing a more user-friendly experience.

“Hi, my name is Ana and I will be your reviewer tonight.”

Can’t Disrupt This: Elsevier and the $25.2-Billion-a-Year Academic Publishing Business

Author: Jason Schmitt
Original: Medium


Twenty years ago (December 18, 1995), Forbes predicted that academic publisher Elsevier’s relevancy and life in the digital age would be short-lived. In an article entitled “The internet’s first victim,” journalist John Hayes highlighted the technological threat the growing internet culture posed to the academic publisher’s profit margin and said, “Cost-cutting librarians and computer-literate professors are bypassing academic journals — bad news for Elsevier.” After publication of the article, investors seemed to heed Hayes’s rationale for Elsevier’s impending demise: Elsevier stock fell 7% in two days to $26 a share.

As the smoke settles twenty years later, one of the clear winners on this long timeline of innovation is the very firm that investors, journalists, and forecasters wrote off early as a casualty of digital evolution: Elsevier. Perhaps to the chagrin of many academics, the publisher has been neither bruised nor battered. In fact, its health is stronger than ever. As of 2015, the academic publishing market that Elsevier leads has an annual revenue of $25.2 billion. According to its 2013 financials, Elsevier had a higher profit margin than Apple, Inc.

Brian Nosek, a professor at the University of Virginia and director of the Center for Open Science, says, “Academic publishing is the perfect business model to make a lot of money. You have the producer and consumer as the same person: the researcher. And the researcher has no idea how much anything costs.” Nosek finds this whole system is designed to maximize the amount of profit. “I, as the researcher, produce the scholarship and I want it to have the biggest impact possible and so what I care about is the prestige of the journal and how many people read it. Once it is finally accepted, since it is so hard to get acceptances, I am so delighted that I will sign anything — send me a form and I will sign it. I have no idea I have signed over my copyright or what implications that has — nor do I care, because it has no impact on me. The reward is the publication.”

Nosek further explains researchers’ unwavering support by voicing the loyal customer’s mantra: “What do you mean libraries are canceling subscriptions to this? I need this. Are you trying to undermine my research?”

In addition to a steadfast dedication by researchers, the academic publishing market, in its own right, is streamlined, aggressive, and significantly capitalistic. The publishing market is also more diverse than just the face of Elsevier. Johan Rooryck, a professor at Universiteit Leiden, says, “Although Elsevier is the publisher that everybody likes to hate, if you look at Taylor & Francis, Wiley, or Springer they all have the same kind of practices.”

Heather Morrison, a professor in the School of Information Studies at the University of Ottawa, unpacks the business model behind academic publisher Springer and says, “If you look at who owns Springer, these are private equity firms, and they have changed owners about five times in the last decade. Springer was owned by the investment group Candover and Cinven who describe themselves as ‘Europe’s largest buy-out firm.’ These are companies who buy companies to decrease the cost and increase the profits and sell them again in two years. This is to whom we scholars are voluntarily handing our work. Are you going to trust them? This is not the public library of science. This is not your average author voluntarily contributing to the commons. These are people who are in business to make the most profit.”

Should a consumer heed Morrison’s rationale and want to look deeper into academic publishers’ cost structures for themselves, they are met with a unique situation: public pricing lists for the journals do not exist. “It’s because they negotiate individually with each institution and they often have non-disclosure agreements with those institutions so they can’t bargain with knowing what others paid,” says Martin Eve, founder of the Open Library of the Humanities.

In addition to a general lack of pricing indexes, the conversation around the value of a publication is further complicated by long-term career worth. David Sundahl, a senior research fellow at the Clayton Christensen Institute for Disruptive Innovation, says, “We actually understand how money passed through to artists who wrote music and authors who wrote books — but it is not clear how the value of a publication in a top tier journal will impact someone’s career. Unlike songs or books where the royalty structure is defined, writing a journal article is not clear and is dependent not on the people who consume the information but rather deans and tenure committees.”

Disruption Doable?

It is precisely this lack of a pricing and value barometer that leads to the complexities associated with disrupting the main players in academic publishing. “Adam Smith’s invisible hand works to lower prices and increase productivity but it can only do so when valuation or pricing is known and the same thing is true for disruption. If you don’t know how to value something, you actually don’t have tiers of a market,” says Sundahl.

If a disruptive force were to significantly change academic publishing, it would need to happen in a market that is currently underserved by, or undesirable to, the large-scale publishers. “Disruptive innovation is usually driven by a group who can’t afford to build something that is as big, fancy and sophisticated as the existing solution — they then have to find a market where either people don’t have anything available to them or they are satisfied with something less than perfect,” says Sundahl.

Should academic scholarship continue on a trajectory similar to that of past decades, Sundahl finds that incumbents (the existing big publishers) almost always win when competition takes place along those sustaining-strategy lines. “To revolutionize academic publication, a new system would need to be developed in a basement market which would eventually enable people to gain enough credibility doing this new solution. People would then begin to value this lower end, well done research, and that is when the world starts to change,” says Sundahl.

That is exactly what large entities like the Bill and Melinda Gates Foundation, or perhaps even top-tier research-one (R1) universities, can’t do. “They have to play the game the way the winners are already playing it. Incumbents almost always win under those conditions,” says Sundahl. To further complicate matters, junior colleges and community colleges, which might otherwise represent fertile ground for a newer, “basement market” entrant, may be less likely to spearhead this new outlet themselves due to increasing government constraints focused nearly exclusively on job placement and starting salaries in lieu of a research-based, theoretical curriculum.

Open Access Packs a Punch

Driven by this lopsided power structure, the move toward open access, the unrestricted availability of academic information, has been growing exponentially. Perhaps open access is itself a “basement market” for leveling the academic publication environment: a market where respect and credibility can be fostered, grown, and transitioned into the existing conversations about academic prestige, merit, and tenure.

“The open access environment is one of the more fertile environments for people to be thinking: if we don’t like the old way, what should the new way look like,” says Heather Joseph, executive director at the Scholarly Publishing and Academic Resources Coalition (SPARC). Joseph finds that the quantifiable numbers of open access journals speak for themselves and says, “You can look at the number of strictly open access journals if you look at the Directory of Open Access Journals (DOAJ). When it started tracking open access journals there were a few dozen and now they list over 10,000 open access journals.”

The push toward open access is not only growing in sheer numbers of journals but also in an increasingly confrontational strategy that academics leverage against large publishers. “At the moment, the Netherlands, the whole country, has said to Elsevier that we want all of our researchers to be able to publish open access in your journals at the same rates we would pay for a subscription last year and if you can’t do that we’re going to cancel every one of your journals, for all of our universities nationwide,” says Eve. “They have a few days left to resolve this, and it looks like they are going to cancel all the Elsevier journals.”

Rooryck found his recent, very public decision to step down and move his Elsevier journal Lingua to open access met with complete support from the other six editors and 31 editorial board members. “The process went very easily. We were all aware of the pricing and Elsevier’s practices and within a week everyone agreed to resign,” says Rooryck. Eve’s platform, the Open Library of Humanities, will now house the new open-access iteration of Lingua, which will be called Glossa. Eve says, “Right away it is 50% cheaper to run it through us than when it was with Elsevier. So anybody subscribing to it already sees 50% more revenue.”

Rooryck finds the move toward broad open access a natural progression and says, “The knowledge we produce as academics and scientists should be publicly available in the same way we have a company that delivers water to our faucets and electricity to our home. These are things we have a right to. Public knowledge and education is a human right and it should not come with a profit tag of 35%.”

Although it appears open access has the ability to simultaneously diffuse academic knowledge to a larger body of readers and cut costs significantly, many feel that the for-profit academic publishers are still positioned to continue into the near future. Joseph says, “I think the play for most smart commercial publishers is to try to preserve the current environment for as long as they can: delay the policy changes, delay the culture changes and to be working on things like tools and services applying to aggregation of data, where they are then embedding themselves more deeply in the workflow of researchers and becoming essential to researchers in a different way.”

“If you are no longer essential to researchers in the ‘you have to publish in my journal in order to get tenure and promotion’ sense, what do they replace that with? I think the smart publishing companies like Elsevier, like Springer, who are very smart in that regard, have been thinking about where they can go to be playing a role of continuing to be seen as essential by the research community once they are no longer playing the role of providing assessment,” says Joseph.

Onward and Upward

“In the US Congress we have been finally making progress with the Fair Access to Science and Technology Research (FASTR) bill. It moved through the committee it was referred to in the Senate and is poised to move out of the Senate and potentially be considered by the House and hopefully pass. Ten years ago, I would have said we didn’t have a chance to do a stand-alone bill,” says Joseph.

Perhaps the recent congressional support Joseph refers to is one more sign that the majority of articles will be moving toward an open and accessible framework. Many in the academic community hope that this government support signals a reprioritization of the research framework and a changing of the guard. And while that shift is extremely important, others in the academic community are hoping to grow “basement markets” from the ground up.

The Center for Open Science, which provides seed funds to startups in the academic scientific research space, is led by Nosek and focuses on aligning scientific values to scientific practices. “The open science framework is just a means at connecting all the research services that researchers use across the entire research life cycle,” says Nosek.

Nosek is optimistic about the evolution of technology in open science and says, “There are a lot of startups going at different parts of the research life cycle. Whether it is publication and what a publication means, or looking at full articles and whether you can make articles convey information in smaller bite size pieces.” Nosek tells me that there are so many solutions happening in research right now and mentions it is hard to judge what the true solutions will look like. “I sometimes think some of the ideas haven’t a chance, but what do I know? I could be completely wrong about it. And that is the whole point — do some experimentation and try out ideas. And the fact is there are a lot of people who see what the problems are and have a unique sense of a potential solution — it is a very lively time to try out different answers.”

Time will tell whether open access will be the disruption needed to allow the academic environment to right itself, or whether a new market will emerge from startup incubators like the Center for Open Science. Regardless of how the future vision is realized, most in the academic community hope that the next iteration of scholarly articles and publishing will do more good for humankind than for a hefty profit margin.


Academic Scattering

Author: Katie Mack
Original: Research Whisperer


A couple of years ago, I was gathering my things after a seminar at a top physics research institution when I overheard two of the senior professors discussing a candidate for a senior lectureship.

Professor A was asking Professor B if the candidate had a partner, which might make him less able to move internationally.

Prof B replied, happily: “No, he has no family. He’s perfect!”

I doubt any selection committee would admit on record to thinking a family-free candidate is “perfect”. Nonetheless, the traditional academic career structure is built around an assumption of mobility that is hard to maintain with any kind of relationships or dependents. I’m still trying to figure out if I can manage to keep a pet.

Right now I live in Australia, working as a postdoc in Melbourne. My first postdoc was in England. Before that I was in grad school in New Jersey, and I was an undergrad in my native California. Halfway through grad school I studied for a year in England. I’ve done two- or three-month stints in Japan, Germany, Australia and the UK. Each of these moves or visits has been, while not strictly required, extremely helpful for my career. And in a field where competition for jobs is so fierce, if you want any hope of landing that coveted permanent academic job, how many of these “helpful” moves can you really consider optional? If mobility is such an advantage, how does having a family or a partner affect your chances?

A couple of months ago, Slate published an article with the headline, “Rule Number One for Female Academics: Don’t Have a Baby.” The point of the article wasn’t actually to discourage women in academia from having children (though backlash from the community may have contributed to the change in title to the somewhat vague, “In the Ivory Tower, Men Only”). The article provided statistics and anecdotes to illustrate how having children, or being suspected of the intent to have children, could harm a woman’s progress in academia – from the necessary pause in research output, to the unconscious or explicit biases that act against “working mothers” but have no similar effect on “working fathers”. Personally, I found the piece deeply disheartening, but my dismay was of a somewhat detached variety. In order to worry about the effects of having children, one has to be in a position where that seems like even a remote possibility. As a single woman with a short-term contract and no idea which hemisphere I’ll be in two years from now, children are not exactly at the forefront of my mind. At the moment, I spend a lot more time thinking about the two-body problem.

In this context, the “two-body problem” is the problem of maintaining a committed relationship between two individuals who are trying to have careers in academia. When the two-body problem proves unsolvable, it’s sometimes called “academic scattering”. It is by no means unique to academia, but the international nature of the field, the frequency of short-term (1-3 year) contracts, and the low wages compared to other similarly intense career paths make it especially bad for academics. In the sciences, the gender disparity adds a further complication for female academics: when women make up a small percentage of the discipline, they are much more likely to be partnered with other academics.

Of course, solving the two-body problem is not impossible. I have many colleagues who have done it, either through spousal hires, fortuitous job opportunities, extended long-distance relationships, or various degrees of compromise. It takes sacrifice, luck, and, often, institutional support. But couples just beginning a relationship while building two academic careers might find the odds stacked against them. Even ignoring for a moment the fact that a no-compromise work-obsessed lifestyle is still considered a virtue in many institutions, academic careers are structurally best suited to people with no relationships or dependents, who travel light and have their passports at the ready.

It varies by field, but for physics and astronomy, a “typical” tenure-track career path looks something like this: 4-6 years in grad school, a postdoctoral fellowship for 1-3 years, then usually another (and maybe another), all followed by a tenure-track or permanent job, which may or may not be the job you end up in for the long-term. There’s no guarantee all these steps will be in the same country – very often they are not. For me, it’s been an international move every time so far, and it’s very possible the next one will be, too. When I took up my first postdoc, I left my country of origin, most of my worldly possessions, all my friends and family, and a committed relationship, to start all over in England. When I took up my second postdoc, I left my newly built life in England and another committed relationship to start all over yet again on the other side of the world. I’ve moved internationally several times chasing the prospect of permanent academic employment. I have yet to convince anyone to come with me.

I’m not trying to convince anyone that avoiding academia or refusing to move around the world is the key to solving all relationship problems. Anyone can be unlucky in love, even if they stay in the same city their entire lives. But academic scattering is particularly hostile to romance. The short-term contracts mean that when you arrive in a new country, if you’re hoping to find a long-term partner, you have something like two years to meet someone and convince them to follow you wherever you might end up next, without being able to tell them where that will be. If you happen to have different citizenships (which is likely), there are immigration issues to consider as well – your partner may not be able to follow you without a spousal visa, which can mean a rather hasty life-long commitment or, depending on the marriage laws of the country in question, a total impossibility. I had a friend in grad school who, at the end of her PhD, faced a choice between living with her wife in Canada and becoming a tenure-track professor at one of the most prestigious research universities in the USA.

The timing doesn’t help, either. The postdoc stage, when you’re doing your best impersonation of a human pinball, usually comes about in your late 20s or early 30s. It’s a time when it seems like all your non-academic friends are buying houses, getting married, having babies, and generally living what looks like a regular grown-up life. Meanwhile, chances are you’re residing in a single room in a short-term rental, wondering which country you’ll be living in next year. If you’re a woman, you might be keeping an eye on the latest research on fertility in older mothers, and mentally calculating how long you actually need to know someone before deciding to reproduce with them, because by the time you’re in one place long enough to think about settling down you’ll be, at best, pushing 40.

There are lots of ways to make it all work out, of course. You could refuse to date other academics, and instead make sure you’re spending enough time on hobbies outside of the university to attract someone’s interest, while making sure you have a REALLY good pitch about the joy of imminent mystery relocation. You could date another academic, and resign yourself to a relationship that will probably be long-distance for far longer than it was ever face-to-face, with no guaranteed reunion in sight. For this option, make sure that you have lots of disposable income for plane tickets and that neither of you is committed to spending too much time inside a lab. You could swear off serious dating altogether until you’re getting close to landing a permanent job, then negotiate with your future employer for a spousal hire, with the necessary career compromise that will be required of one or both of you to be at that particular institution.

Or you could just wait till you’ve scored a permanent faculty job somewhere, probably in your mid-to-late 30s, and (if you’re a woman) hope that you meet someone soon enough that starting a family is still an option. (As a side note, my late-thirties single straight female friends tell me that men who want babies won’t date women over 35. Obviously this is an unfair and unscientific generalization, but the point is that there are societal pressures that women face when they choose to put off the prospect of a family until they have a permanent job.) If you choose this option, you might also want to keep in mind that a tenure-track job isn’t necessarily permanent, and having a child before you have tenure is exactly the scenario the article I mentioned at the start had a few things to say about.

Or you could decide to prioritize where you want to be (or who you want to be with), and, more likely than not, end up severely limiting your career progress and/or leaving academia altogether. If one or the other partner does have to make a big career sacrifice, gender norms will suggest that, if you’re a woman, the one to make the sacrifice really ought to be you.

As for me, I confess I haven’t figured it out. I have two years left on my contract in Australia and no idea whatsoever which country I’ll end up in next. I’m applying broadly, and there’s no guarantee I’ll have a choice about location if I want to stay on the path toward becoming tenure-track faculty at a major research institution. When it’s not unusual for a single postdoc job to have 300 applicants, and faculty jobs are even more selective, getting even one offer is considered a huge win.

I don’t know if there’s a solution. Having a pool of early-career researchers who move frequently to different institutions unquestionably advances research and keeps the ideas flowing. It is also usually great for the development of postdocs’ research abilities, exposing them to new ideas and work styles. But the prospect of a nearly decade-long period of lifestyle limbo between graduate studies and the start of the tenure track is, understandably, a significant discouragement to many fine researchers who might otherwise bring their unique insights to the field. And, statistically, more of these lost researchers are likely to be women. It may not be the dominant force keeping women out of science or academia, and it may not affect all women, but any slight statistical skew that disadvantages women more than men contributes to the inequality we see. And that makes academia a little bit more lonely for everyone.
