On #PeerRevWk16: An Entirely Cynical Perspective

N. C. Hall  /  12/10/2016


#PeerRevWk16 is an annual effort by academic publishers to bolster flagging peer review participation, quality, and speed through explicit statements of thanks and recognition.

Although this initiative could be viewed as a face-valid effort by a public service industry charged by governments and post-secondary institutions with the sacred, inestimable responsibility of research dissemination, it sidesteps the major ongoing issues underlying academics’ reluctance to review. From huge publisher profits afforded by gouging public institutions while not meaningfully compensating academics, to unjustifiably high open access fees and peer review patents intended to stifle competition, there are serious systemic problems underlying the peer review process that this hashtag effort does little to address.

Basically, I started to feel uncomfortable seeing publishers attempt to dominate a hashtag ostensibly “for” academics with tweets featuring marketing-department infographics on what academics want, promoting a new reviewer ratings system, or sharing “how-to” guides to cost-effectively improve the quality/speed of free academic labour. In response, it seemed important to balance this profitable status quo narrative by highlighting the uncomfortable realities of the peer review process for academics. I am by no means an expert on higher education policy/ethics/economics; I just wanted to share information and balance the discussion about how to promote research quality by better supporting those who do it.

It all started a few weeks ago when I first noticed tweets from academic publishers pop up in my timeline underscoring the importance and novelty of thanking peer reviewers as well as quantifying/ranking peer review efforts:

In typical fashion, I responded with flippant sarcastic commentary, thinking it to be an obviously transparent (and hopefully temporary) publisher effort to pacify volunteer reviewers with a pat on the back and self-relevant data:

But this weird gratitude parade only seemed to be ramping up, and it got me thinking more seriously about the motivation behind these reviewer appreciation efforts:

As a good academic, I supplemented these devastating hot takes with references to external sources outlining the growing dissent concerning the publication process:

With such an eviscerating response to this uncomfortable wave of public publisher affection, I thought my job was done. However, I soon realized there was an actual hashtag for this initiative – #PeerRevWk16 – and an entire week to come of publisher efforts to spam Twitter with pre-scheduled, strategic gratitude PR aimed at thanking academics by educating them as to their peer review value and responsibilities.

Some #PeerRevWk16 publisher tweets hoped to inform researchers of the importance of peer review as the cornerstone of scientific inquiry, as if they were somehow not addressing individuals who by definition should not only be intimately familiar with the scientific process but have based their research careers largely on this premise:

Other tweets expressed heartfelt thanks to reviewers for their time and effort through mass cut-and-paste “publishers are people too” gestures garnering remarkably few RTs or replies:

Publisher spam also included regularly scheduled marketing-office infographic blasts educating academics about why they do (read “should do”) peer reviews, with most results ironically showing academics to have already decided on better ways to spend their time:

And then there were the tweets consistently promoting the new reviewer recognition system “Publons,” a publisher-owned effort to bolster peer reviewer commitment by tracking, quantifying, and ranking peer reviewers:

But perhaps the most condescending #PeerRevWk16 tweets were those gently informing academics as to how they could better perform their free publisher labour:

So I admittedly got a bit snarky:

And being on sabbatical, I soon diverted words from manuscript revisions to countering this increasingly awkward, oblivious, and patronizing publisher narrative implying that problematic peer review disengagement could be remedied not by meaningful compensation or real talk about peer review costs, but by a Twitter campaign aimed at educating, flattering, and shaming academics. Again, I’m not an expert on the academic publishing industry, but it seemed important to share some thoughts on issues that were clearly being avoided, such as:

1.  The peer review burden on vulnerable academics:

2.  The ethics of peer review compensation:

3.  In-store credit as review compensation:

4.  Financial compensation for peer reviews:

5.  The exclusion of industry expertise:

6.  Peer review sampling bias:

7.  The “gamification” of peer review:

8.  My personal review perspective:

9.  Public perception of publisher appreciation efforts:

So while the #PeerRevWk16 initiative does on the surface present as an effort to simply thank and support peer reviewers, a quick consideration of the academic publishing landscape suggests that it may also represent an effort to whitewash growing public discontent over a massively profitable industry that does shamefully little to show respect for the free academic labour on which it relies:

So for good measure, I doubled down with @AcademicsSay to better punctuate the #PeerRevWk16 publisher noise:

Even Nature got in on the fun:

And despite publisher-provided highlight reels of #PeerRevWk16 in which most of the above is effectively excluded, the narrative that resonated most with academics was obvious:

As to where to go from here, there were a few thoughts:

Maybe it’s just me, but this hashtag effort at best seems intended to distract from publisher problems or promote new publisher products. At worst, it seems a fundamentally misguided attempt to sustain profits by increasing peer review engagement among (a) inexperienced, less expert academics not yet familiar with the scientific process; (b) early career researchers trying any way they can to demonstrate to potential employers a willingness to sacrifice their time and energy; or (c) already overburdened academics disillusioned with the publication process who need and will take the self-esteem boost despite its patronizing tone.

Is a thank you from publishers for peer reviewing appreciated? Perhaps, but that’s not why we do it. And as a transparent attempt to placate a base increasingly dissatisfied with publishers profiting from their good will, institutional/disciplinary pressure, and passion for science, the #PeerRevWk16 effort kinda looks like using the “tip” section of a bill to provide actual tips on how to serve publishers better:

Of course, I might be entirely off-base in interpreting #PeerRevWk16 as anything other than a face-valid attempt to show some much-needed appreciation to hardworking volunteers. But as a leading authority on pandering to academics on Twitter, I can safely say that academic publisher trolling could use some work.


What I Learned About Writing by Not Writing

Author: Rebecca A. Adelman
Original: www.rebeccaaadelman.com


All is not lost.  What I have lacked in tangible productivity over my long season of writer’s block (which seems finally to be limping its way to a close), I have gained in new understandings of the intricacies of my writing process and the fussy mechanics of getting words on the page.

When you aren’t getting words on the page, it’s crazy annoying (at best) to hear about people that are.  And it’s similarly unpleasant to receive unsolicited suggestions about how to get yourself unstuck.  As if it was simply a matter of will or ergonomics or mental hygiene.  But if it was that easy, anyone could do it.  Producing good work, and doing it well, takes more than that.  So here are a few things I figured out about being productive when I was struggling to produce anything at all.  It’s an open letter, of sorts, to my writerly self – the “I” is me, and so is the “you.”  But the “you” can also be, you know, you, if you are reading this and wanting to reconsider your writing praxis.

Become attuned to your limits.
It’s hard to tune out the constant drone of academic meta-commentary about how much (or, from the occasional maverick, how little) we work.  And it helps to know that most of those aggrandizing self-reports are bullshit.  But even still, focusing too much on what other people are doing, or not, just leaves me insecure, or anxious, or envious.  So spend less time worrying about what other people are doing and focus on your own patterns. Then figure out how you work, and be honest about whether all the hours you spend “working” are actually that.  For example, I’ve figured out that I’m neither efficient nor terribly lucid after dinner, and that even when I go back to work late in the evening, I’m not getting much done besides maybe assuaging my guilt about not working enough.

Diminishing returns are a thing.  So consider whether you might be better served by reinvesting those mediocre or largely symbolic work hours elsewhere.

Figure out how you want the experience of writing to feel.  
Turns out, there are no extra points for suffering.  Or if there are, they circulate in an economy that is wildly unrewarding.  Like the counters where you redeem your tickets at arcades: a small fortune in tokens and hours spent playing Skeeball leave you with an armload of little cardboard rectangles, and the teenager in charge of the whole operation barely acknowledges you when you come to select your prize, and it ends up that all you can afford is a pencil case.  Anyway.

Few of us have the luxury, presumably, to only write when it feels good.  Deadlines, tenure, promotion, &c.  But unless you produce your best work in the throes of abject misery, experiment with the novel practice of setting your writing aside when writing feels terrible.  We all have different thresholds for ‘terrible,’ and that terrible feeling might be mental or physical, but when you encounter that threshold, I think it’s smart to heed it. Admittedly, I am still relatively new to the routine of being a peer-reviewer, but I have not yet encountered a reviewer questionnaire instructing me to give special consideration to a project if I think the author cried a lot (A LOT) while they composed it.  And if there are people who will give you extra credit for your anguish, think carefully about whether you want to play by that set of rules.

Spend some time thinking about how it feels when you are doing your best work.  Maybe you feel focused, or excited, or peaceful, or maybe you’re so in it that you don’t feel anything at all.  Take advantage of those times, figure out how to increase their frequency if possible, develop strategies for doing good-enough work in circumstances that only approximate them.  And otherwise: leave it alone.

Work at a pace that’s sustainable.
Pretty much every academic I know, including me, is overcommitted.  There are lots of reasons for this, both individual and structural.  Obviously, everybody will define “overcommitted” in their own ways, and experience being overcommitted idiosyncratically.  I’ll need to figure out, eventually, why I have a tendency to hoard projects, but here’s what I know for now: I tend to overestimate the amount of time that I have before a deadline, while underestimating how much work I will want to put into a given project.  Part of me also imagines that the asteroid will surely hit between now and whatever deadline so it won’t actually matter.

I can manage the consequences of my over- and underestimating (as well as the general paucity of asteroids) fairly well under normal circumstances.  But when shit inevitably happens, that mismatch becomes acutely untenable.

So: try to plan out your projects and commitments, as best as you are able, so that they align with how busy you want to be, and when, while also maintaining an overall mode of existence that is tolerable.  (Parenthetically, I think academics ought to aspire to existences that are more than tolerable, and break the habit of postponing tolerability until the summer.)  Not all of this is in your control, of course, so another part of writing and working well is, I think, accepting that those plans won’t always pan out.  And leave a margin for catastrophes, great and small.  If your whole writing scheme is contingent on you never getting a flat tire / your kid never getting sick / you never getting called for jury duty / no one you love ever needing you or dying, it probably isn’t going to work for you long-term.

Consider what it’s worth to you.
Because we are all, alas, constrained by the laws of time and space, doing one thing generally means not doing another (or half-doing two things at once).  Try to be cognizant of the trade-offs your writing affords and requires of you.  Be honest about whether the potential rewards actually appeal to you, and your values.  And then consider the costs, and whether they’re acceptable.  With a few exceptions, I am generally fine to sacrifice binge-watching for writing.  And sometimes I feel very okay opting out of being social so I can stay in and work.  But on the other hand, it’s almost never worth it to me – though it used to be – to trade work for sleep, or healthy food, or exercise.  Maybe your non-negotiable stuff is different.  The point is to figure out what that non-negotiable stuff is, and protect it … otherwise work will eat it all.

Detach from the outcome.
Beyond doing your best to make your ideas intelligible and your style engaging, you can’t control how people will respond to your writing.  Consider your audience, but don’t obsess about them, and learn the difference between wanting to connect with your readers and needing to charm and trap them into your ways of seeing and thinking.  Efforts to engineer reader reactions almost never generate better writing, and are much more likely to result in arguments that overreach or resort to pedantry, while the fixation on impressing your audience will ultimately leave you stultified and unable to say much of anything at all.  Good ideas are much easier to come by than magic words.

Look, and move, forward. 
You will have seasons when you are more productive, seasons when you are less productive, and seasons when you are scarcely functional.  Hopefully, over the course of your writing life, these will balance out into an overall sense of accomplishment, with a body of work that bears it out.  When you are more productive, spend some time figuring out what enables you to work at that level, but don’t make yourself crazy trying to recreate it every time you encounter a slump.  Chances are, it’s mostly a matter of circumstance: a legitimate manifestation of your brilliance, sure, but maybe also just good luck.  Conversely, the seasons when you are less productive are also likely to be those in which your luck is worse than usual, and not a final revelation of your incompetence.

Capitalism tells us that time is modular, that any hour has potentially the same value as any other hour, and hence that missed hours can be replaced.  Nope.  If there is something big that keeps you from your work for a season, you won’t (sorry) be able to get those hours back.  And especially if that something big is also something massively unpleasant, you probably won’t be able to stop feeling lousy about those lost hours, anxious or mournful about the work you could be doing, and resentful of the people around you who happen to be enjoying one of those good-luck seasons of magical writing.  In those moments, all you can do is muddle through: do what you can with your radically reduced resources, plead for deadline clemency if you need it, and accept – your overwhelming fatigue may help lubricate this process – that you probably won’t be producing your very best work at this particular godawful juncture.  And don’t compound the insult by blaming yourself for those lost hours, those words left unwritten.  For my part, now that I’m halfway (give or take) back in the saddle after a pretty unrelentingly miserable eighteen months, it’s a daily struggle not to take the losses of that period out on myself.  It takes a lot of mental discipline to focus on what you can do, not on what you didn’t do because you couldn’t.

*    *    *    *    *

So that’s a little bit of what I know now that I didn’t know before.  It strikes me as odd that academics, generally so good at questioning why things are the way they are, rarely bring their skeptical sensibilities to the task of questioning their own work habits or the expectations they have internalized.  And for those who are satisfied with their circumstances, there may be no need for this kind of querying.  But I get the impression (or maybe I just run with an exceptionally grumpy crowd) that lots of us are less than satisfied.  Of course, many of the reasons for that are structural, and so insuperable by these tiny little hacks.  But despite this, or maybe because of it, minor adjustments made in the service of your own comfort are meaningful, worth it, and necessary.


Could Parental Leave Actually be Good for my Academic Career?

Author: David Kent
Original: University Affairs | The Black Hole


Last autumn, I started my research lab at the University of Cambridge’s Stem Cell Institute, but this coming summer I’m doing something completely different – I’m taking parental leave with my first child. I must admit that at least some inspiration came from my brother, who took a term off with his second child and said it was one of the best decisions he’d ever made.

It’s been a tough journey to get a group leader position – 11 years of intense research-focused time, most of which were spent in a complete black hole of uncertainty with respect to my future career. And now, I won’t be in the lab for 14 weeks – we’ll see how it all works out.

Reaction to my decision amongst non-academic family and friends was pretty much universally positive, but reaction from academic colleagues was highly variable – a substantial number of whom think I’m absolutely crazy to take off so much time within the first year of my research lab’s existence. I wasn’t too surprised by this, having emerged from the North American system where parental leave is much less generous than in Europe. What I didn’t expect were the other reactions …

In November, I was at a national cancer conference and at one of the evening receptions I spoke with a female scientist from another U.K. university about women in science. Over the course of the discussion, I mentioned that my partner and I would be taking advantage of the U.K.’s new “Shared Parental Leave” policy, with my partner taking 8.5 months of leave and me taking 3.5 months. She said she was shocked and surprised that a brand new group leader would take the time off, but also said “good for you.”

The next evening was when things really hit home, though. After the conference dinner I was on the dance floor and a complete stranger came up to me and asked, “Are you David Kent?” I assumed she had seen my presentation earlier in the day until she continued, “the David Kent who is taking parental leave as a new group leader? I just wanted to say thank you.” We chatted a little and it was as simple as this: a male group leader taking parental leave was just not that common, especially not a 3.5-month block of time. The scientist from the previous evening had clearly gone off and told her colleagues, and word had spread.

Here I was being showered with praise for taking 3.5 months off work, and feeling pretty good about my decision, until I did a quick comparison to the situation of my partner, also an early career scientist. Not only would she be taking nearly three times the amount of leave, but she’s also been carrying a baby around for eight months whilst undertaking world-class research. Is there a small fan club of approving academics lined up to congratulate her on the brave decision to spend time with her child? Not that I’ve seen.

So, in effect, my taking a short block of parental leave has boosted my profile in the eyes of some academics, while her taking a longer block will put her in the challenging position that so many young female academics find themselves in: trying to play catch-up and pretend that children haven’t impacted their careers (many do not acknowledge children on CVs, job applications, etc., for fear of being viewed unfavourably). The science community needs to embrace rather than shun such individuals.

Overall, if universities want more women in science, then the way we handle babies and families needs to change – men need to be as “risky” to hire as women. But change does not come overnight and it does not come easy. As a start, more countries (and institutions) need to have “use it or lose it” policies like the one in Quebec, where the father is given a block of time that the mother cannot use. Universities and individuals need to fight for this. Countries such as Sweden have seen incredible results from such policies and are amongst the world leaders in having women in senior positions. For science specifically, granting agencies need to behave like the European Research Council with respect to eligibility windows and like EMBO for postdoctoral fellowships – creating small allowances for young parents that make the journey just a little bit easier.

Or perhaps we should just force them all out of science – that seems to be the way things are currently set up and it makes me worry for our future science workforce.


Scientists Have the Power to Change the Publishing System

Author: David Kent
Original: University Affairs | The Black Hole


Earlier this month I read an article by Julia Belluz that ripped into the scientific publishing system. The saddest, and truest, sentiment of the article can be summed up in the following quotation:

“Taxpayers fund a lot of the science that gets done, academics peer review it for free, and then journals charge users ludicrous sums of money to view the finished product.”

This is certainly not the first attack against the publishing process nor the first to encourage open-access publishing. In the remainder of her article, Ms. Belluz focuses on the role that governments can play in getting more scientific research freely and instantly available. In sum, she suggests that government funding agencies (e.g., the United States National Institutes of Health or the Canadian Institutes of Health Research) could refuse to give grants to those scientists who did not publish in open-access journals.

This is a laudable goal, and indeed it is the approach being taken bit by bit by funding agencies – the Wellcome Trust in the U.K., for example, has a very robust open access policy that includes providing grant funding for open-access charges. While this will certainly get more research out sooner and without charge, I believe it misses an important aspect of the power dynamic that plagues the scientific publishing process.

The fact is that journals with high impact factors wield enormous power because they hold the key to scientists’ careers – the field has become so obsessed with metrics that it is insufficient to be a good scientist with good ideas and the ability to perform good research. As things stand now, if you want research grants (and in most cases, this means if you want a job), then you need to publish a paper (or several!) with a big-name journal.

So what can scientists do? Well, it turns out scientists are involved in just about every aspect of the publishing power dynamic. First, one needs to understand what’s at stake. Scientists want big name papers for three main reasons:

  1. Grants
  2. Jobs
  3. Recognition

However, papers in big-name journals do not directly give you grants or jobs, nor are they the only way to be recognized as a good scientist. Other scientists make these decisions, but far too often their judgment is impacted by the glitz and glam of the big-name journals.

Jobs are often won by those doing research that has good institutional fit – they bring a novel technology, a new way of looking at things, or a broad network of excellent former colleagues – but jobs are often lost because the candidate is “not fundable.” The latter is more often than not decided based on where they have published and how a grants panel will view them. So it basically comes down to who can get grants. And who generally decides funding outcomes? Scientists.

I wonder how many grant panels have heard the phrase “the project looks good, but the candidate has only ever published in mid-range journals.” Indeed, I know several scientists who rank applications based on a candidate’s publication record irrespective of how good or bad the project is or how well-resourced the working environment is.

One suggestion: Ban the CV from the grant review process. Rank the projects based on the ideas and the ability to carry out the research rather than on whether someone has published in Nature, Cell or Science. This could in turn remove the pressure to publish in big journals. I’ve often wondered how much of this could actually be chalked up to sheer laziness on the part of scientists perusing the literature and reviewing grants – “Which journals should I scan for recent papers? Just the big ones surely…” or “This candidate has published in Nature already, they’ll probably do it again, no need to read the proposal too closely.”

Of course I generalize, and there are many crusaders out there (Michael Eisen, Randy Schekman, Fiona Watt, etc.) pushing to change things, and I mean them no offence. I just wish that more people could feel safe enough to follow their lead. In my own journey to start up a lab, I am under enormous pressure to publish in a big journal (i.e., my open-access PLoS Biology paper doesn’t make the grade, and the open-access juggernaut eLife has yet to achieve high-level status despite its many philosophical backers).

So, in sum, scientists in positions of power (peer reviewers, institute directors, funding panel chairs) are the real targets for change. Assess based on research merit, not journal label. Let’s make journals tools of communication, not power brokers of scientific careers.


Giving Up On Academic Stardom

Author: Eric Grollman
Original: Conditionally Accepted


I have bought into the ego-driven status game in academia. Hard. I find myself sometimes wondering more about opportunities to advance my reputation, status, name, and scholarship than about creating new knowledge and empowering disadvantaged communities. Decision-making in my research often entails asking what will yield the most publications, in the highest-status journals, with the quickest turnaround in peer review. I often compare my CV to others’, wondering how to achieve what they have that I have not, and feeling smug about achieving things that they haven’t. Rarely do I ask how to become a better researcher, but often ask how to become a more popular researcher.

I have drunk the Kool-Aid, and it is making me sick. Literally. The obsession with becoming an academic rockstar fuels my anxiety. I fixate on what is next, ignore the present, and do a horrible job of celebrating past achievements and victories. I struggle to accept “acceptable.” I feel compelled to exceed expectations; I take pride when I do. “Wow, only six years in grad school?” “Two publications in your first year on the tenure track?! And, you’re at a liberal arts college?”

When did I become this way? Sure, academia is not totally to blame. My parents expected me to surpass them in education (they have master’s degrees!). I also suffer, as many gay men do, with the desire to excel to gain family approval, which is partially lost upon coming out. Excelling in college, rather than becoming an HIV-positive drug addict, helped my parents to accept my queer identity. In general, I compensate professionally and socially for my publicly known sexual orientation. It is hard to unlearn the fear one will not be loved or accepted, especially when homophobes remind you that fear is a matter of survival.

Oh, but academia. You turned this achievement-oriented boy into an anxious wreck of a man. It is not simply a bonus to be an academic rockstar of sorts. My job security actually depends on it. And, it was necessary to be exceptional to even get this job. And, it matters in other ways that indirectly affect my job security, and my status in general. You can forget being elected into leadership positions in your discipline if no one knows you. “Who?” eyes say as they read your name tag at conferences before averting their gaze to avoid interacting. I have learned from my critics that one must be an established scholar before you can advocate for change in academia.

The Consequences Of Striving For Academic Stardom

I am giving up on my dream to become the Lady Gaga of sociology. I have to do so for my health. I have to stop comparing myself to other scholars because so many things vary, making it nearly impossible to find a truly fair comparison. Of course, I will never match the publication powerhouse that is an Ivy League man professor whose wife is a homemaker. Even with that example, I simply do not know enough about another person’s life, goals, and values to make a comparison. I do not want others to compare themselves to me because my level of productivity also entails Generalized Anxiety Disorder. I am not a good model, either!

Dreams of academic stardom prevent me from appreciating my present circumstances, which were not handed to me. Sadly, voices, which sound awfully similar to my dissertation committees’, have repeatedly asked, “are you surrreeee you don’t want to be at an R1?” I have zero interest in leaving, and negative interest (if that is possible) in enduring the job market again. But, I fear that, as I was warned, I will become professionally irrelevant; and, this has made it difficult to fully appreciate where I am. I have acknowledged the reality that no place will be perfect for an outspoken gay Black intellectual activist. But, I have found a great place that holds promise for even better.

Beyond my health, the lure of academic stardom detracts from what is most important to me: making a difference in the world. Impact factors, citation rates, and the number of publications that I amass distract from impact in the world and accessibility. It is incredibly selfish, or at least self-serving, to focus more energy on advancing my own career rather than advancing my own communities.

Obsession with academic rockstardom forced me to view colleagues in my field as competition. My goal has been to demonstrate in my research that what I do is better than what they do. In doing so, I fail to see how we can collaborate directly on projects, or at least join as a chorus of voices on a particular social problem. Yet, in reality, no individual’s work can make a difference alone. I also fail to appreciate the great things my colleagues accomplish when I view them only through jealous eyes.

When I die, I do not want one of my regrets to be that I worked too hard, or did not live authentically, or did not prioritize my health and happiness as much as I did my job.  Ok, end of rant.


The Lie Guy

Author: Clancy Martin
Original: Chronicle of Higher Education


You’d think I’d get used to being called a liar. After all, I’ve written a candid, semiautobiographical novel about being a scam artist, been interviewed in the media about my former life of lying, cheating, and drinking, even edited a prominent philosophical collection on deception. But when a colleague recently ridiculed me about being known as a liar, my feelings were hurt. I have a new life. I’ve been clean and sober and “rigorously honest” (as we say in AA) for two years. Still, to tell you the truth (honestly!), I earned my reputation fair and square.

In the Internet age, a sordid past is a matter of very public record—for that matter, of public exaggeration—and if you write fiction and memoir about your worst days, as I did (and continue to do), even your students will take the time to read the racy parts (or at least excerpts in online interviews of the racy parts, or YouTube interviews about the racy parts).

God bless and keep tenure—I’d probably hesitate to be frank in this essay without it—although, to be fair to my institution, the ignominious stories about me and my novel were out before my committee granted me tenure. “It takes an odd person to work on lying,” my late mentor (and friend and co-author), the philosopher Robert C. Solomon, once told me, himself having written one or two of the best papers on the subject.

When I was 26 years old, in 1993, I dropped out of grad school at the University of Texas at Austin—I was on a fellowship, staring day after day at my stalled dissertation among stacks of books and papers from the Kierkegaard Archive in the Royal Library in Copenhagen—to go into the luxury-jewelry business. I decided to burn all of my bridges. I didn’t fill out any forms. I didn’t have the ordinary courtesy even to contact my two dissertation directors, Solomon and Louis H. Mackey. I just vanished.

I told myself that it was a conscious strategy, to prevent myself from going back, but I also knew the truth: that I was simply too ashamed to tell them that I had gone into business for the money. Like many of our deceptions, mine was motivated by cowardice: “Tell the people what they want to hear,” or, if you can’t do that, simply don’t tell them anything at all.

A few years later, my next-door neighbor (my wife and I had just moved in) caught me in the driveway and asked, “Hey, Clancy. Did you go to grad school at the University of Texas?”

“I did, that’s right.” I was already uncomfortable. I opened the door of my convertible. The Texas summer sun frowned cruelly down on me.

“I’m an editor of Bob Solomon’s. He told me to say hello.”

Busted. This was Solomon’s way of calling me on my b.s. It was his personal and philosophical motto, adopted from Sartre: “No excuses!” Take responsibility for your actions. Above all, avoid bad faith. Look at yourself in the mirror and accept—if possible, embrace—the person that you are.

But I was on my way to work, and Bob Solomon, at that point in my life, was the least of my problems. I had him stored neatly in the mental safety-deposit box of “people I had not lied to but had betrayed in a related way.”

The jewelry business—like many other businesses, especially those that depend on selling—lends itself to lies. It’s hard to make money selling used Rolexes as what they are, but if you clean one up and make it look new, suddenly there’s a little profit in the deal. Grading diamonds is a subjective business, and the better a diamond looks to you when you’re grading it, the more money it’s worth—as long as you can convince your customer that it’s the grade you’re selling it as. Here’s an easy, effective way to do that: First lie to yourself about what grade the diamond is; then you can sincerely tell your customer “the truth” about what it’s worth.

As I would tell my salespeople: If you want to be an expert deceiver, master the art of self-deception. People will believe you when they see that you yourself are deeply convinced. It sounds difficult to do, but in fact it’s easy—we are already experts at lying to ourselves. We believe just what we want to believe. And the customer will help in this process, because she or he wants the diamond—where else can I get such a good deal on such a high-quality stone?—to be of a certain size and quality. At the same time, he or she does not want to pay the price that the actual diamond, were it what you claimed it to be, would cost. The transaction is a collaboration of lies and self-deceptions.

Here’s a quick lesson in selling. You never know when it might come in handy. When I went on the market as a Ph.D., I had six interviews and six fly-backs. That unnaturally high ratio existed not because I was smarter or more prepared than my competition. It was because I was outselling most of them.

Pretend you are selling a piece of jewelry: a useless thing, small, easily lost, that is also grossly expensive. I, your customer, wander into the store. Pretend to be polishing the showcases. Watch to see what is catching my eye. Stand back, let me prowl a bit. I will come back to a piece or two; something will draw me. You see the spark of allure. (All great selling is a form of seduction.) Now make your approach. Take a bracelet from the showcase that is near, but not too near, the piece I am interested in. Admire it; polish it with a gold cloth; comment quietly, appraisingly on it. You’re still ignoring me. Now, almost as though talking to yourself, take the piece I like from the showcase: “Now this is a piece of jewelry. I love this piece.” Suddenly you see me there. “Isn’t this a beautiful thing? The average person wouldn’t even notice this. But if you’re in the business, if you really know what to look for, a piece like this is why people wear fine jewelry. This is what a connoisseur looks for.” (If it’s a gold rope chain, a stainless-steel Rolex, or something else very common and mundane, you’ll have to finesse the line a bit, but you get the idea.)

From there it’s easy: Use the several kinds of lies Aristotle identified in Nicomachean Ethics: A good mixture of subtle flattery, understatement, humorous boastfulness, playful storytelling, and gentle irony will establish that “you’re one of us, and I’m one of you.” We are alike, we are friends, we can trust each other.

The problem is, once lying to your customer as a way of doing business becomes habitual, it reaches into other areas of your business, and then into your personal life. Soon the instrument of pleasing people becomes the goal of pleasing people. For example, who wouldn’t want to buy a high-quality one-carat diamond for just $3,000? (Such a diamond would cost $4,500 to $10,000, retail, depending on where you buy it.) But you can’t make a profit selling that diamond for $3,000—you can’t even buy one wholesale for that amount. Since the customer can’t tell the difference anyway, why not make your profit and please the customer by simply misrepresenting the merchandise? But that’s deceptive trade! There are laws against that! (There’s a body of law, in fact: the Uniform Deceptive Trade Practices Act. Texas awards triple damages plus attorney’s fees to the successful plaintiff.) Aren’t you worried about criminal—or at least civil—consequences? And how do you look at yourself in the mirror before you go to bed at night?

During my bleakest days in business, when I felt like taking a Zen monk’s vow of silence so that not a single lie would escape my lips, I often took a long lunch and drove to a campus—Southern Methodist University, Texas Christian University, the University of Texas at Arlington—to see the college kids outside reading books or holding hands or hurrying to class, and to reassure myself that there was a place where life made sense, where people were happy and thinking about something other than profit, where people still believed that truth mattered and were even in pursuit of it. (OK, perhaps I was a bit naïve about academic life.)

I was in the luxury-jewelry business for nearly seven years, and though I don’t believe in the existence of a soul, exactly, I came to understand what people mean when they say you are losing your soul. The lies I told in my business life migrated. Soon I was lying to my wife. The habit of telling people what they wanted to hear became the easiest way to navigate my way through any day. They don’t call it “the cold, hard truth” without reason: Flattering falsehoods are like a big, expensive comforter—as long as the comforter is never pulled off the bed.

It seemed that I could do what I wanted without ever suffering the consequences of my actions, as long as I created the appearance that people wanted to see. It took a lot of intellectual effort. I grew skinnier. I needed more and more cocaine to keep all my lies straight. And then, one morning, I realized that I had been standing in “the executive bathroom” (reserved for my partner and myself) at the marble sink before a large, gilt Venetian mirror every morning for days, with my Glock in my mouth (in the jewelry business, everyone has a handgun). I still remember the oily taste of that barrel. Before I confronted the fact that I was trying to kill myself, I had probably put that gun in my mouth, oh, I don’t know—20, 30 times. I said, “Enough.”

I called Bob Solomon. That was in May of 2000.

I was relieved when he didn’t answer his phone. I left a message: “I’m sorry, Dr. Solomon. I’d like to come back.” Words to that effect, but at much greater length. I think the beep cut me off.

When he called back, I was too frightened to pick up. I listened to his voice-mail message. He said, “Clancy, this is not a good time to make yourself difficult to get ahold of.”

I called again. He let me off easy. (He was perhaps the most generous person I’ve ever known.) I caught him up with the past six years of my life. He told me to call him Bob, not Dr. Solomon: “We’re past that.” Then he said, “So, why do you want to come back?”

“I want to finish what I started, Bob.”

“That’s a lousy reason. Try again.”

“I need to make a living that’s not in business. I hate being a businessman, Bob.”

“So be a lawyer. Be a doctor. You’ll make more money. It’s not easy to get a job as a professor these days, Clancy.”

“It’s the one thing I really enjoyed. Philosophy was the only thing that ever truly interested me. And I have some things I want to figure out.”

“Now you’re talking. Like what? What are you thinking about?”

“Lying. Or failure. I feel like I know a lot about both of them right now.”

(I was writing a long essay about suicide, which, come to think of it, might have been more to the point at the time. But I didn’t want to scare him off.)

A beat.

“Nobody wants to read about failure. It’s too depressing. But lying is interesting. Deception? Or self-deception? Or, I’m guessing, both?”

“Exactly. Both. How they work together.”

With the help of a couple of other professors who remembered me fondly, in the fall semester of 2000, Bob Solomon brought me back to the philosophy doctoral program at Austin, and I started work on a dissertation called “Nietzsche on Deception.” One of the other graduate students—Jessica Berry, now one of philosophy’s best young Nietzsche scholars—called me “the lie guy,” and the moniker stuck.

I went to work on deception not because I wanted to learn how to lie better—I had mastered the art, as far as I was concerned—but because I wanted to cure myself of being a liar. What had started out as a morally pernicious technique had become a character-defining vice. I had to save myself. I needed to understand the knots I had tied myself into before I could begin to untangle them. (It seems like an odd solution now. At the time, I thought I was too smart for therapy.)

It’s an old idea, of course: The Delphic injunction “Know thyself” is an epistemological duty with moral muscle, intended for a therapeutic purpose. Throughout the history of philosophy, until quite recently, it was thought that the practice of philosophy should have a powerful impact on the philosopher’s life—even, ideally, on the lives of others. So I studied deception and self-deception, how they worked together, why they are so common, what harms they might do, and when, in fact, they may be both useful and necessary. Think, for example, about the misrepresentation, evasion, and self-deception involved in falling in love. Who hasn’t asked, when falling in love, “But am I making all this up?” Erving Goffman would have agreed with the joke—I think we owe it to Chris Rock: “When you meet someone new, you aren’t meeting that person, you’re meeting his agent.”

I was lucky: I was awarded my Ph.D. in 2003, and I got a job. Being part of a university as a professor was very different from being a student, even a grad student. Suddenly you have power. In business—especially in retail—the customer has all the power. But students are nothing like customers, although they are starting to act more and more that way, I’ve noticed, and have eagerly adopted the motto “the customer is always right.” My fellow professors wore their power like a crown. They didn’t feel the need to pull a smile out of anyone.

I was still going from classroom to committee room trying to please everyone. I don’t think it harmed me or anyone else, particularly: It was simply unnecessary. As that sank in, I became disoriented. It reminded me of when I was in St. Petersburg, Russia, in the 1990s, trying to hire the world’s best (and most underpaid) jewelers. No one cared about your money. The concept hadn’t yet sunk its teeth into the post-Communist soul. Similarly, in academe, no one paid much attention to the capital—charm—I was accustomed to spending in my daily life.

In fact, charm could even be a hindrance. In my first year, I was asked by a senior colleague to be the research mentor to a philosopher who had been hired around the same time. After talking about my research, my colleague added, “You are mostly who you seem to be.” This from a man who prided himself on being only who he seemed to be—as though we are all only one person!—and as a way of letting me know that he had “seen through me,” that he “was not prey to my charms.” Also, no doubt he was gently letting me know that I didn’t have to pretend to be someone other than I was.

In my old life, everyone was always trying to be more charming than everyone else—even the gruffness of certain wholesalers was (everyone understood) only pretense, the pose of authenticity, the rough exterior that hid the honest, caring heart. To be charming was among the highest virtues.

But now the chair of a science department at my university—a person whom I like very much, and who is enormously charming—and other colleagues often seem suspicious of charm in anyone. Charm is what you expect from administrators, and they, we all know, are not to be trusted. Administrators are just glorified salespeople who can’t publish (so the story goes). A charming student is a dishonest student, an apple polisher.

If I was a bit rude to people, however, if I acted superior, if I had the right mix of intellectual distance and modest moral disdain, I was suddenly a member of the club. I had to be the opposite of eager to please. Other people must be eager to please me. And if they were, I should be suspicious of them. They should be subservient without being (obviously) obsequious. They can flatter, but never as a salesperson flatters; I want flattery only from my equals. This from people who were regularly checking RateMyProfessors.com to see how many hot peppers they’d earned. Or who fretted—or, still worse, pretended not to fret—about their teaching evaluations.

I got Bob Solomon on the phone again.

“Bob, the professor business is even sleazier than the jewelry business. At least in the jewelry business we were honest about being fake. Plus, when I go to conferences, I’ve never seen such pretentiousness. These are the most precious people I’ve ever met.”

“Come on, Clancy. Did you really think people were going to be any better in a university?”

“Um, kind of.” Of course I did. “And it’s not that they’re not better. They’re worse.”

“Well, you may have a point there.” (Bob was always very tough on the profession of being a professor.) “Focus on the students and your writing. The rest of it is b.s.” (That was a favorite expression of Bob’s, as it is of a former colleague of his at Princeton, Harry Frankfurt.)

“With the students, I still feel like I’m selling.” (I was very worried about this.)

“You are selling. That’s part of what it is to be a good teacher.” (Bob was in the university’s Academy of Distinguished Teachers and had won every teaching award in the book. He also made several series of tapes for the Teaching Company.) “To be a good teacher, you have to be part stand-up comic, part door-to-door salesman, part expert, part counselor. Do what feels natural. Be yourself. Are your students liking it? Is it working for you?”

“Yes.” They liked it all right, maybe a bit too much. “And I think they’re learning.”

“Then forget about the rest of it. Just have fun. That’s the best reason for doing it.”

Stendhal wrote: “With me it is a matter of almost instinctive belief that when any … man speaks, he lies—and most especially when he writes.” I still like to tell a good story. But doesn’t everybody who loves teaching? How else are you going to liven up the classroom when students’ eyes are always turning to their iPhones or laptops?

People often ask me now if I miss the jewelry business. My brother and I rode elephants in the mountains of northern Thailand to buy rubies from the miners. I flew to Hong Kong to buy a rope of gigantic black South Sea pearls—each nearly the size of your thumb—and a precious antique jade bracelet from a dying Chinese billionairess, and flew to Paris two days later to sell it to a customer. I walked through the winding, crowded streets of Jerusalem with my diamond wholesaler, talking about the two-state solution. I stayed at the Four Seasons, the Mandarin Oriental, or private mansions of friends. I lived shoulder-to-shoulder with celebrity clients, flew first class, had my suits custom-made, vacationed in Bali or wherever I wanted. More important—thinking of my life today—I didn’t worry about whether my daughters might have to take out student loans.

And the truth is, a lot of the time, that life was fun. The people were rich, noisy, outrageous. When I opened a new store, I felt like I’d created something special.

Would I go back? Do I miss it? No. Sometimes—I write this looking out my office window at the 100-year-old trees outside, their boughs barely lifting and falling in the autumn wind—I feel like a monk who has retreated from a world that was too much for him. “The greatest part of virtue lies in avoiding the opportunity for vice,” St. Augustine teaches us.

Maybe I’m persisting in a kind of self-deceptive naïveté that Bob wouldn’t have approved of, but you could say that my livelihood now depends on telling the truth. Back then I was arms-and-shoulders deep into life, and now at times I feel as though I am only skating on its mirrored surface. But I’d be afraid to go back. I feel peaceful now. It’s less work to be me, and to have me around. I don’t feel the need to lie. Most of the time.

 


Dr. Martin’s new book on deception in romantic relationships, “Love and Lies,” is now available.


Academic Scattering

Author: Katie Mack
Original: Research Whisperer


A couple of years ago, I was gathering my things after a seminar at a top physics research institution when I overheard two of the senior professors discussing a candidate for a senior lectureship.

Professor A was asking Professor B if the candidate had a partner, which might make him less able to move internationally.

Prof B replied, happily: “No, he has no family. He’s perfect!”

I doubt any selection committee would admit on the record to thinking a family-free candidate is “perfect”. Nonetheless, the traditional academic career structure is built around an assumption of mobility that is hard to maintain with relationships or dependents of any kind. I’m still trying to figure out if I can manage to keep a pet.

Right now I live in Australia, working as a postdoc in Melbourne. My first postdoc was in England. Before that I was in grad school in New Jersey, and I was an undergrad in my native California. Halfway through grad school I studied for a year in England. I’ve done two- or three-month stints in Japan, Germany, Australia and the UK. Each of these moves or visits has been, while not strictly required, extremely helpful for my career. And in a field where competition for jobs is so fierce, if you want any hope of landing that coveted permanent academic job, how many of these “helpful” moves can you really consider optional? If mobility is such an advantage, how does having a family or a partner affect your chances?

A couple of months ago, Slate published an article with the headline, “Rule Number One for Female Academics: Don’t Have a Baby.” The point of the article wasn’t actually to discourage women in academia from having children (though backlash from the community may have contributed to the change in title to the somewhat vague, “In the Ivory Tower, Men Only”). The article provided statistics and anecdotes to illustrate how having children, or being suspected of the intent to have children, could harm a woman’s progress in academia – from the necessary pause in research output, to the unconscious or explicit biases that act against “working mothers” but have no similar effect on “working fathers”. Personally, I found the piece deeply disheartening, but my dismay was of a somewhat detached variety. In order to worry about the effects of having children, one has to be in a position where that seems like even a remote possibility. As a single woman with a short-term contract and no idea which hemisphere I’ll be in two years from now, children are not exactly at the forefront of my mind. At the moment, I spend a lot more time thinking about the two-body problem.

In this context, the “two-body problem” is the problem of maintaining a committed relationship between two individuals who are trying to have careers in academia. When the two-body problem proves unsolvable, it’s sometimes called “academic scattering”. It is by no means unique to academia, but the international nature of the field, the frequency of short-term (1-3 year) contracts, and the low wages compared to other similarly intense career paths make it especially bad for academics. In the sciences, the gender disparity adds a further complication for female academics: when women make up a small percentage of the discipline, they are much more likely to be partnered with other academics.

Of course, solving the two-body problem is not impossible. I have many colleagues who have done it, either through spousal hires, fortuitous job opportunities, extended long-distance relationships, or various degrees of compromise. It takes sacrifice, luck, and, often, institutional support. But couples just beginning a relationship while building two academic careers might find the odds stacked against them. Even ignoring for a moment the fact that a no-compromise work-obsessed lifestyle is still considered a virtue in many institutions, academic careers are structurally best suited to people with no relationships or dependents, who travel light and have their passports at the ready.

It varies by field, but for physics and astronomy, a “typical” tenure-track career path looks something like this: 4-6 years in grad school, a postdoctoral fellowship for 1-3 years, then usually another (and maybe another), all followed by a tenure-track or permanent job, which may or may not be the job you end up in for the long-term. There’s no guarantee all these steps will be in the same country – very often they are not. For me, it’s been an international move every time so far, and it’s very possible the next one will be, too. When I took up my first postdoc, I left my country of origin, most of my worldly possessions, all my friends and family, and a committed relationship, to start all over in England. When I took up my second postdoc, I left my newly built life in England and another committed relationship to start all over yet again on the other side of the world. I’ve moved internationally several times chasing the prospect of permanent academic employment. I have yet to convince anyone to come with me.

I’m not trying to convince anyone that avoiding academia or refusing to move around the world is the key to solving all relationship problems. Anyone can be unlucky in love, even if they stay in the same city their entire lives. But academic shuffling is particularly hostile to romance. The short-term contracts mean that when you arrive in a new country, if you’re interested in finding a long-term partner, you have something like two years to identify and convince a person you’ve just met to agree to follow you wherever you might end up in the world, and you won’t be able to tell them where that will be. If you happen to have different citizenships (which is likely), you have to take into account immigration issues as well – your partner may not be able to follow you without a spousal visa, which can mean a rather hasty life-long commitment, or, depending on the marriage laws of the country in question, a total impossibility. I had a friend in grad school who, at the end of her PhD, faced a choice between living with her wife in Canada, and becoming a tenure-track professor at one of the most prestigious research universities in the USA.

The timing doesn’t help, either. The postdoc stage, when you’re doing your best impersonation of a human pinball, usually comes about in your late 20s or early 30s. It’s a time when it seems like all your non-academic friends are buying houses, getting married, having babies, and generally living what looks like a regular grown-up life. Meanwhile, chances are you’re residing in a single room in a short-term rental, wondering which country you’ll be living in next year. If you’re a woman, you might be keeping an eye on the latest research on fertility in older mothers, and mentally calculating how long you actually need to know someone before deciding to reproduce with them, because by the time you’re in one place long enough to think about settling down you’ll be, at best, pushing 40.

There are lots of ways to make it all work out, of course. You could refuse to date other academics, and instead make sure you’re spending enough time on hobbies outside of the university to attract someone’s interest, while making sure you have a REALLY good pitch about the joy of imminent mystery relocation. You could date another academic, and resign yourself to a relationship that will probably be long-distance for far longer than it was ever face-to-face, with no guaranteed reunion in sight. For this option, make sure that you have lots of disposable income for plane tickets and that neither of you is committed to spending too much time inside a lab. You could swear off serious dating altogether until you’re getting close to landing a permanent job, then negotiate with your future employer for a spousal hire, with the necessary career compromise that will be required of one or both of you to be at that particular institution.

Or you could just wait till you’ve scored a permanent faculty job somewhere, probably in your mid-to-late 30s, and (if you’re a woman) hope that you meet someone soon enough that starting a family is still an option. (As a side note, my late-thirties single straight female friends tell me that men who want babies won’t date women over 35. Obviously this is an unfair and unscientific generalization, but the point is that there are societal pressures that women face when they choose to put off the prospect of families until they have permanent jobs.) If you choose this option, you might also want to keep in mind that a tenure-track job isn’t necessarily permanent, and having a child before having tenure is one of those options that the aforementioned article had a few things to say about.

Or you could decide to prioritize where you want to be (or who you want to be with), and, more likely than not, end up severely limiting your career progress and/or leaving academia altogether. If one or the other partner does have to make a big career sacrifice, gender norms will suggest that, if you’re a woman, the one to make the sacrifice really ought to be you.

As for me, I confess I haven’t figured it out. I have two years left on my contract in Australia and no idea whatsoever which country I’ll end up in next. I’m applying broadly, and there’s no guarantee I’ll have a choice about location if I want to stay on the path toward becoming tenure-track faculty at a major research institution. When it’s not unusual for a single postdoc job to have 300 applicants, and faculty jobs are even more selective, getting even one offer is considered a huge win.

I don’t know if there’s a solution. Having a pool of early-career researchers who move frequently to different institutions unquestionably advances research and keeps the ideas flowing. It is also usually great for the development of postdocs’ research abilities, exposing them to new ideas and work styles. But the prospect of a nearly decade-long period of lifestyle limbo between graduate studies and the start of the tenure track is, understandably, a significant discouragement to many fine researchers who might otherwise bring their unique insights to the field. And, statistically, more of these lost researchers are likely to be women. It may not be the dominant force keeping women out of science or academia, and it may not affect all women, but any slight statistical skew that disadvantages women more than men contributes to the inequality we see. And that makes academia a little bit more lonely for everyone.
