Wednesday, July 20, 2011

Hauser Resigns After Research Fraud


Marc Hauser has resigned from Harvard, following the discovery that he falsified experimental data in at least three papers. He is going to work "on the educational needs of at-risk teenagers" (which is really laying it on with a trowel; couldn't he just "want to spend more time with his family" or something?).

Why would a lauded, even lionized researcher of such stature commit fraud? Only he knows, of course, but I suspect his towering status has more than a little to do with it. Author of multiple books, head of his own lab at one of the most prestigious research institutions in the world, greeted as a celebrity wherever he goes, with half his own staff and students seeing him as a hero - the pressure to perform, to keep getting results, must have been immense. No matter what, he is still only a fallible human, and asking him to be superhuman, to be perfect, is neither realistic nor fair.

Very few people are born cheaters. The vast majority of researchers enter their field because they genuinely want to understand the world around them, and to share their findings with the world. If money or fame is your goal, then science is not the right field for you. I doubt most cheaters start with outright, large-scale fraud, and I doubt Hauser did either.

But I imagine that when failure is no longer an option - when your career, your status, your own livelihood and the future careers of people under your charge all depend on your success - people will all too often start down a slippery slope: clean up a fuzzy picture just a little, omit those obvious outliers, perhaps rerun an experiment that just failed to reach significance. The pressure doesn't let up, and once you've started it's easy to take just another small step, and another... One day you wake up and realize you've become something you used to despise.

How to stop this? We can't stop hero-worshipping, and research funding agencies understandably want to support successful research over failures. It's one thing to tell people that it's OK to fail, but the reality is that a failed project is a real handicap when funding is as cut-throat as it is today. Better oversight and a culture of transparency would help - it should always be OK to talk about odd events or suspicious data in your lab, and a whistleblower should ideally be able to count on the full support of their university. Reminding people what is and is not good research practice throughout their training is another good idea; almost all people want to do things correctly, so you want to stop them from starting down that slippery slope before they even realize it.
 
As for Hauser himself? He may or may not return to academia in some years. But with this resignation - and the lingering doubt cast over all his earlier publications, since he is unlikely to have started with full-blown fraud - I don't expect him ever to return to active, high-profile research.

2 comments:

  1. It's a fact: researchers will more often find dead ends and failures than true, well-documented, certified achievements. I understand him and why he did it... not saying it's good... but it's understandable

  2. [Apologies for this very late answer]

    And still, most of us find nothing but dead ends and yet do not cheat or cut corners.

    I think there are two parts to things like this: first, there are the outside pressures, the outside situation that pushes somebody toward doing something bad. Without that pressure they would likely not have acted the way they did.

    But the other is that only a subset of people act badly given these pressures. Only a subset have the "roomy conscience", the lack of empathy, the overriding desire to be seen as a success that enables them to cross a line most of their peers avoid.

    Understanding is not excusing, for this very reason.


Comment away. Be nice. I no longer allow anonymous posts, to reduce the spam.