Thursday, January 20, 2011

academic ethics in research

Office of Graduate Studies - Tufts University

Sherpas in the Labs

Emerging scientists and engineers learn to navigate foggy, treacherous ethical terrain.
by Johanna Schlegel

Where do scientists and engineers learn right from wrong? Is it a binary construct, like machine language, or more relative, like humidity or barometric pressure?

And does the highly competitive funding process justify a situational approach to ethics—just as the unwritten conventions for doing business in different cultures almost mandate, say some, an ethical pragmatism? Is guiding graduate students, postdocs, and new faculty through these professional gray areas the job of governments, faculty, peers, mentors—or all of the above?

"Murky and Obscure"
"I still don't know how faculty learn to manage or how students learn ethical conduct," said Livia Racz, division leader of the Advanced Hardware Division at Draper Laboratories and former professor of mechanical engineering at Tufts University. Racz, who continues to advise School of Engineering (SOE) graduate students through Draper's fellowship program, is both a recipient and an awarder of research funding.

"As adults, we are expected to make important decisions each day," said Lynne Pepall, dean of the Graduate School of Arts and Sciences. "While students are in the learning mode of their lives, it would be best for all of us in the academy to teach students ethical conduct in the same way we teach other vitally important concepts."

Current Graduate School of Arts and Sciences (GSAS) and SOE students in science and engineering disciplines described an environment in which, according to one anonymous Ph.D. candidate in biomedical engineering, "things are very murky and obscure," not to mention inconsistent.

Angela Zapata, G00, chemistry, concurred. "Students learn not to explicitly falsify data or plagiarize, but interpreting data and paraphrasing can be a fuzzy line."

According to the National Postdoctoral Association's Responsible Conduct of Research Toolkit, "Responsible conduct of research (RCR)—especially for early career scientists—is much more of a gray area than just 'doing the right thing' and in fact constitutes all the small decisions made during the research activities of every day. Some of the common decisions faced by postdocs may not be so clear-cut, such as: Who should be first author on your first lab paper? Can you Photoshop® your publication images to make them easier to interpret? What role should your adviser play in your new collaborations outside the lab? Such questions can be harder to answer, especially without good training or mentorship." 1

Ranjith Anand of the Ph.D. program in biology said, "I was fortunate to have a principal investigator (PI) who followed the general rules of professional conduct."

It No Longer Stays in Vegas
Professor Sheldon Krimsky of the GSAS Department of Urban and Environmental Policy and Planning asserts that there are conflicts of interest inherent in many, if not most, sponsored research projects. In Science in the Private Interest: Has the Lure of Profits Corrupted Biomedical Research? (Rowman & Littlefield, Lanham, Maryland, 2003), Krimsky argues that academic freedom and free enterprise are strange bedfellows at best, and at worst the cause of a slow erosion of integrity at every research institution that has ever accepted corporate dollars.

As a practical matter, it's unclear how many academic institutions would still be standing if not for sponsored research. Yet the pace of change, the increase of corporate funding in both real-dollar and percentage terms, and the globalization of the academy have created a perfect ethical storm: who works for whom, who defines the rules of engagement, and what is it exactly that one should do in Rome that's different from what one should do anywhere else, scientifically speaking?

A few years ago, the National Science Foundation (NSF), taking a cue from the National Institutes of Health (NIH), decided that what one does in Rome or what happens in Vegas is nobody's business...unless it's done by U.S. researchers backed by NSF or NIH money. Effective earlier this year, academic institutions like Tufts that submit proposals for NSF funding must provide RCR training to undergraduate and graduate students and postdocs who will apply for that funding. In essence, the government is mandating a piece of the science and engineering curriculum in every academic research institution in the United States.

"The government believes it has to lead the way to make ethics programming part of graduate education," said Dean Pepall. "But I'd like us to ask—not just because the NSF said so—'In this day and age, what does it mean to be a community? What does it mean to be a responsible member of society and contribute to the public good?' It requires a conversation."

It requires a conversation, say NSF and NIH (as interpreted by Columbia University via the outline of its RCR training), about research misconduct; conflicts of interest; publication practices and responsible authorship; data acquisition, management, sharing, and ownership; mentor/trainee responsibilities; peer review; collaborative science; working with human and animal subjects as well as stem cells; and the scientist as citizen.

At Tufts, the conversation is underway. According to Peggy Newell, vice provost, "The responsible conduct of research can and should be learned in many different ways. Sometimes it happens in individual mentoring relationships, such as when a graduate student or postdoctoral fellow works in a particular laboratory under the direction of a faculty member; it can happen through formal courses or presentations; and there are many books, articles, and tutorials on the topic. It can happen through deliberate action on the part of a teacher and it can be conveyed in more subtle ways, through observation, inferences, institutional or department culture, and examples. Ideally, it would be discussed in every research setting, not just once in the form of a course requirement but in an ongoing manner as a student or fellow continues to develop as a researcher.

"All of these approaches exist across Tufts' three campuses. There are examples of formal courses, such as a requirement at the Sackler School of Graduate Biomedical Sciences that all graduate students complete a course in the responsible conduct of research, and there are online tutorials available through our website. We are in the process of revising the courses we offer for people who are required to complete one as a condition of receiving funding from the NIH or NSF, and hope to have a new program available in the coming year."

Slicing the Pie
Tufts also has an intellectual property (IP) committee, out of the Office of the President, which puts forward several principles related to the interests of research sponsors, the right of the individual to own the IP resulting from his or her research, and the need to address situations in which potential commercialization could create a financial conflict of interest for the researcher. The IP committee is also asked to "recommend a distribution formula for any royalties or other income which may be received from patents or licenses obtained in the name of the University." There's also an Office of Patents and Licensing to help scholars get their work patented.

Regardless of the operating principles, it takes skill to negotiate the terms of sponsored research so as to create a win-win. This is the work of the PI and the research sponsor, and an area in which a neophyte who isn't careful could sign away valuable rights. Or, as a recent case study shows, the negotiation could turn out in both parties' favor.

At Draper, Racz is concerned with the fabrication of MEMS, which are micro-electro-mechanical systems. In a recent contract she awarded, she negotiated with a team in Florida to develop a dielectric nanocomposite with a low thermal expansion coefficient and low cure shrinkage. The recipients wanted the scope of work to include magnetic nanocomposites as well.

Although this type of nanocomposite is of less interest to Draper, Racz agreed that 25 percent of the funds could be used in this way. Draper has unlimited, but not exclusive use of the resulting technology; the researchers own the intellectual property.

Some labs have developed a solution to the IP issue: a researcher who decides to commercialize his or her research and profit from it must formally leave the lab, and vice versa. These labs may also have formal royalty arrangements meant to keep them competitive with the private sector.

Zapata, the chemistry graduate alumna, noted, "They all take a cut. The breakdown depends on university policies; whether funding is from private versus government sources; and if it's the government, whether any IP will be used for government or commercial purposes."

One Eye on the Clock, One Eye on the Tiller
Depending on the funding source—contracts versus grants—different regulations apply. Zapata described the difference: "The contract is awarded with the agreement that objectives and deliverables will be met within the schedule; whereas the grant is more loosely executed, and meeting objectives can be satisfied with showing progress within the grant's timeframe. Contracts are more stringent."

A scientist and electrical engineering graduate alumnus who works in a large lab in Massachusetts explained how this works in a collaborative research context.

"People from various disciplines often assist each other on multiple programs, and the project or group leaders are responsible for managing the funding distribution for the staff. This is a great feature; it frees the staff to spend more of our time working on technical issues. I have been impressed at how honest people are regarding property and procedures."

At Draper, Racz described the steps she takes to keep research teams within the scope of the project.

"When awarding funding, I ask faculty to supplement their proposals with a scope of work. These usually go back and forth as we establish the budget and a schedule. I ask for weekly conference calls, a semiannual meeting, and other management measures to reinforce accountability."

Graduate students may or may not feel the full intent of these negotiations.

"Often the PI gives a graduate student full power over any project or grant they are working on," said the anonymous biomedical engineering student. "There might be some meetings where the PI checks in, but the researcher is entrusted to do an adequate job performing and documenting the research."

There's an implicit trust in the culture. But is it well placed? If a team reaches conclusions at odds with published findings, for example, is every professor equally likely to encourage that team to continue with that line of inquiry?

Power Plays and Ground Rules
"Advisers hold so much power," said English graduate alumna Carmen Lowe, G95, G03, director of the Academic Resource Center and interim dean of undergraduate studies. "They could easily take advantage of graduate students' scholarship or steal their intellectual property."

One possible fallout from this imbalance of power is the case of Marc Hauser. Last summer, Harvard University found Hauser, a professor of psychology and director of the Cognitive Evolution Laboratory, guilty of eight instances of scientific misconduct involving three published papers and five additional experiments. While Harvard's findings have certainly damaged the career of Professor Hauser, his actions also had a profound impact on his researchers, as conveyed in a Chronicle of Higher Education article titled, "Document Sheds Light on Investigation at Harvard." The article's author, referencing an internal Harvard University document, writes of an environment in which "research assistants became convinced that the professor was reporting bogus data and how he aggressively pushed back against those who questioned his findings or asked for verification." Although no decision has been made regarding Professor Hauser's future at Harvard, at least one of his former researchers, as reported by the Chronicle, has left the field of psychology entirely. In the internal document, this individual summed up the precarious position graduate students can find themselves in when he noted that "the most disconcerting part of the whole experience to me was the feeling that Marc was using his position of authority to force us to accept sloppy (at best) science" (Tom Bartlett, "Document Sheds Light on Investigation at Harvard," The Chronicle of Higher Education, online edition, August 19, 2010).

While the story of Marc Hauser is an extreme example of what can happen when an adviser abuses his or her power, there are other concerns facing scholars—namely publications and authorship, a sticky area for advisers and graduate students alike.

The NSF strongly advises those submitting proposals to determine in advance how many papers will likely result, whose names will appear on them, and in what order. The intent is to mitigate problems that often arise in determining legitimate ownership of research results.

"The professor has the grant, students step in late in the game, and sometimes it can be difficult to determine whose research is whose," said Lowe. "When an adviser plagiarizes from a graduate student or takes credit for his or her work, the student is powerless; to take steps would be to destroy one's career."

This plays both ways: a journalist pursuing an agenda or not understanding the conventions of publication and authorship could publish a story giving disproportionate credit to a junior researcher who made "great copy," at the expense of that researcher's PI, adviser, or teammates.

It's clear that establishing ground rules for publication and authorship up front can protect every research team member's work. Nevertheless, projects evolve; and sometimes the rules of engagement need to be renegotiated.

Ranjith Anand of the biology program said, "When we submitted our research paper, I was listed as second author; the graduate student in our collaborator's lab was first. By the time the paper had gone through the peer review process, one year had gone past. During this period, I was the person responsible for conducting the remaining experiments. I thought it was fair to ask for equal first authorship, to which our collaborators agreed. I don't think this was a tough negotiation. However, I can imagine a situation where the negotiations could have taken a wrong turn."

Interested, Yet Disinterested
"The main [ethical] issues we face," the electrical engineering graduate alumnus said, "are avoiding conflicts of interest, and our lab has been diligent in educating us about this issue. Since the lab participates in a variety of projects involving many different sponsors, employees must remain impartial and disinterested in the outcomes of our work," even if the answers are unfavorable.

Scientific conflicts of interest can take multiple forms. For example, a financial conflict of interest would arise if a researcher who held a commercial stake in the outcome of a project became predisposed to ignore unfavorable results—that is, if he or she no longer remained impartial or "disinterested" in the Mertonian sense (See "Is Merton Still Relevant?").

Peer review also presents potential conflicts, as when personal networks enable scholars to propel proposals or papers onto a fast track where current or former colleagues or cronies don't do their usual due diligence.

Krimsky adds two others: "In clinical research, there is a concern that the physician's interest in an experimental patient treatment may compromise patient care by downplaying risks." Also, he asserts, "Clinical researchers may appropriate body tissue or genetic information as intellectual property from people who come under their care."2

One could argue that Harvard Professor Marc Hauser's vulnerability was his inability to remain "impartial or disinterested." The first of many dominoes to fall before the professor was an experiment in which the ability of rhesus monkeys to recognize sound patterns was tested. The professor and a research assistant coded the experiment's data, drawing vastly different conclusions: Hauser found that the monkeys noticed changes in sound patterns; his research assistant did not. A second research assistant analyzed both results and found, like his fellow graduate student, that nothing indicated that the monkeys recognized the changing patterns. In fact, according to the internal document provided to The Chronicle of Higher Education, "they looked at the speaker more often when the pattern was the same...the experiment was a bust." (see "Document Sheds Light on Investigation at Harvard," The Chronicle of Higher Education.) In the case of Professor Marc Hauser, it appears that "disinterest" was lost.

Been Caught Stealin'
Another article in The Chronicle of Higher Education reported this statistic: "The latest surveys by the Center for Academic Integrity found that 22 percent of students say they have cheated on a test or exam, but about twice as many—43 percent—have engaged in 'unauthorized collaboration' on homework" (Jeffrey R. Young, "High-Tech Cheating Abounds, and Professors Bear Some Blame," The Chronicle of Higher Education, volume LVI, number 29, April 2, 2010).

"Undergrads try to get away with a lot more with teaching assistants (TAs) than with professors," said Interim Dean Lowe. "A lot of grad students never think much about academic integrity until they find themselves in the role of a TA, proctoring exams, and grading papers. We try to give them a heads-up about what the TA's role is and how to conduct it."

When a TA becomes aware of cheating—whether it's a student buying a paper online or copying a fellow student's lab report—the teaching assistant must immediately bring this matter to the attention of the professor, faculty adviser, or department chair as stipulated in the Graduate School of Arts and Sciences and School of Engineering TA handbook.

Cheating in science classes—especially in labs, where how you derive the answer is as important as getting the "right" answer—defeats the purpose of learning to become an investigator. This is a tough lesson for premedical students in foundational undergraduate biology courses, said Lowe, because their high school environments emphasize correct answers and high scores.

Has the Fast Pace of Science...Slowed the Pace of Science?
"It is essential to the very meaning of a university that, among its many concerns, involvement with and concern for significant controversial contemporary issues should be given a high order of priority. We must constantly ask, of what relevance to the crucial, life-and-death issues confronting modern society is our work and effort," according to the Tufts faculty handbook for Arts, Sciences, and Engineering.

Yet research results are becoming more difficult to repeat, objective peer reviews are more elusive, and the U.S. patent office is organized for last century's discoveries. "Crucial, life-and-death issues" can thus be difficult to fund.

Racz of Draper Labs explained why. "It's hard to get truly objective scientific reviews. If your research is too far-out, no one believes you, so either you plug away for years and it will catch on, or not. One of my peers spent years on a facial recognition algorithm, and nobody cared. However, someone saw her work recently and offered her a job on the spot. Another friend has been doing research on aging for years, and that's now a popular subject. But many scientists who work on big problems are never successful."

She explained why so much research these days is incremental. "Private industry is most concerned with applied, results-oriented research. It is difficult to be competitive if you do not show direct correlation of your proposed work to a profitable outcome. Federal government contracts are also most concerned with producing knowledge/products that will be quickly applied in the field. Both types of funding want a big return for a 'small' investment. Federal grant money enables both basic and applied research, although you must show a significance to a major national or global problem. Even though this sounds like the ideal funding source for investigating 'deep thoughts', it must still show a feasible, quick road to application, and it is very network-oriented!"

"I have to say, though," reflected Racz, "that some of my best foundational work was funded by Intel. It involved flat section wafers of the smallest scale. We teamed up with tribology during the early stages of commercial product development; Intel funded us and some others." (Tribology, according to Wikipedia, is "the science and engineering of interacting surfaces in relative motion.") Mechanical engineering professor Chris Rogers played a role, helping bring the parties together.

Merton, Murkiness, and Mandates: Where To Next?
"Presenting our work at group meetings, writing papers, discussing science with other students, sharing protocols and ideas and doing collaborative projects are the best ways that Tufts reduces scientific ethical issues," said the biomedical engineering student.

"I've learned how to collaborate, manage projects, write papers, edit and critique others' work and my own, and keep high ethical standards of research by working with many postdocs in the lab and from collaborators with whom I work very closely, designing experiments and analyzing data."

"Ideally," said Vice Provost Newell, "education on responsible conduct of research would be tailored to the general type of research the student is likely to pursue in his or her career and sufficiently relevant to allow the person to understand what it would mean in the context of his or her work. The issues can be difficult, the possible scenarios can be varied and complicated, and there are more gray areas than clear-cut answers. Having an opportunity to think through different cases, to discuss them and hear different perspectives, and to come to some conclusions as to what are the standards in the field is critical to the development of future researchers. Research settings that permit and encourage this kind of discussion and research mentors who are mindful of this aspect of research training are a great complement to formal courses."

Does Merton still matter? Can science and engineering get "back to basics"? In his book, Krimsky expands on the ideas of Merton, hoping to answer these very questions.

He writes of universalism as the idea that "[truth] has no national boundaries"; communalism as "the common ownership of the fruits of scientific investigation"; disinterestedness as the requirement "that scientists apply the methods, perform the analysis, and execute the interpretation of results without considerations of personal gain, ideology, or fidelity to any cause other than the pursuit of truth"; and organized skepticism as the construct that "truth is not the default state of authoritative claims, but the result of applying socially established rules of inquiry to a claim that stands up to critical scrutiny and is subject to reexamination." (Science in the Private Interest, pages 75–79). Together, these operating principles are not only practical, but necessary in an evolving scientific world.

Krimsky is right to imply that corporations or governments should not be allowed to place conditions on funding that run counter to these principles. But the word "organized" may hold the answer for our time—because science and engineering are fundamentally social disciplines. So say not just Merton, decades ago; nor only Krimsky, in a scholarly book; nor merely the researchers and academic leaders we spoke with. Even the architects transforming 200 Boston Avenue, Medford, into a "cluster" and their Tufts "clients" are reasserting, as they revisit what a laboratory needs to be, that the work of scientists and engineers in the twenty-first century is far more collaborative, interdisciplinary, and inter-institutional than Merton could have imagined.

So, one ethical individual does not constitute an ethical profession. But each ethical individual does matter. Because science and engineering—professions that purport to advance what a global society understands to be true—reach their greatest successes when all of their practitioners collude for truth, not for one another...sponsors notwithstanding.

Johanna Schlegel is a freelance writer and marketing consultant in greater Boston.

1 Flint, Kathleen, Responsible Conduct of Research Toolkit: Tools for developing programs on responsible conduct of research for postdocs, National Postdoctoral Association, June 29, 2010.

2 Krimsky, Science in the Private Interest, page 130.

Illustrations by Leigh Wells
