Event Recap: Intellectual Property Panel “From Research to Patent”


by Adrian Rivera-Reyes

On November 10th, the Penn Science Policy Group and the Penn Intellectual Property Group at Penn Law co-hosted a panel discussion focused on intellectual property and how to patent scientific research. The panel included Peter Cicala, Chief Patent Counsel at Celgene Corp.; Dr. Dora Mitchell, Director of the UPstart Program at the Penn Center for Innovation (PCI) Ventures; and Dr. Michael C. Milone, Assistant Professor of Pathology and Laboratory Medicine at the Hospital of the University of Pennsylvania (HUP), and Assistant Professor of Cell and Molecular Biology at Penn Medicine.

The event started with introductions of both groups by their respective presidents and was followed by Kimberly Li introducing the panelists. Next, Peter gave a short PowerPoint presentation offering a general introduction to intellectual property. Below are some key points for understanding intellectual property and patent law [1,2]:

1) In general, patents provide a “limited monopoly” that excludes others from making, using, offering for sale, selling, or otherwise practicing an invention, but they do not confer upon the patentee a right to use the invention. Thus, patents serve as a form of protection for the owner.
2) A single invention can only be patented once; once the patent on that invention expires, others may not file to patent the same invention again.
3) In order to confer a patent, the United States Patent and Trademark Office ensures that inventions of patentable subject matter meet the following legal requirements: i) inventions must be novel, ii) inventions must be useful, and iii) inventions must be non-obvious.
4) Utility patents last only 20 years from the date of filing. After 20 years, anyone may make, use, offer for sale, sell, or practice the invention; a single invention cannot be re-patented after the term ends. In contrast, trademarks and trade secrets can last indefinitely (as long as the mark remains in use or the secret is kept), and copyrights last for the life of the author plus 70 years.
5) The United States Patent and Trademark Office follows the ‘first to file’ rule: the first person or entity to file an application for an invention is presumed to be entitled to the patent.
6) Patents can be invalidated by the United States Patent and Trademark Office.

A clever example discussed by Peter Cicala was the patenting of a new car feature. If company X has received a patent on a car and company Y invents a new feature for that car, company Y can patent the new feature (as long as it meets the legal requirements introduced above). Once the patent on the feature is granted, however, company Y may produce only that feature, not the car patented by company X, unless company X grants it a license. Thus, company Y’s patent gives it only the power to prevent others from making the new feature.

Conferring Patents in the US and Internationally

First, there has to be an invention of some sort. Once there is an invention, a patent application is filed. Patent applications are drafted free-hand; unlike a tax return, there is no specific form to fill out, so one starts from scratch. Applications are usually long (some reach 500 pages) and there are many legal requirements governing what to say and how to say it. Eventually, the application goes to the patent office, where a patent examiner will, as the name suggests, examine it, corresponding with the applicant over the course of 3-5 years and pointing out sections that need further editing, clarification, or justification. There is a lot of back and forth until the examiner agrees that the invention satisfies the patent requirements. Then one pays the fees and the patent is awarded. Fun fact: in the US, patents are granted only on Tuesdays.

On a global basis, one files a single international application, and designated patent offices around the world examine it locally. If an office grants a patent, that patent is valid only in its jurisdiction. This is why patenting costs so much: one files, and pays legal fees, in each jurisdiction. For example, if a patent on a compound is granted only in Japan, a different entity can manufacture the compound freely in the US, but not in Japan. This is one reason why companies and universities are very careful when filing patents.

Intellectual Property in Industry

Pharmaceutical products start with a great idea, but for every product that reaches the market, about 10,000 fail. Companies therefore file many patents, even though many of those patents may have no commercial value in 5-6 years. It costs about $500K (including filing and attorneys’ fees) to obtain a single issued patent, which means companies spend enormous sums on patents (i.e., 10,000 patent filings at roughly $500K each)! Out of those 10,000 patents, typically one will earn the company an estimated $5 billion a year in returns.
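The portfolio economics quoted above can be sanity-checked with back-of-the-envelope arithmetic. This is an illustrative sketch using only the rough figures cited in the discussion, not real industry data; all variable names are, of course, hypothetical:

```python
# Rough patent-portfolio economics as described by the panelist.
# All figures are the approximate numbers quoted in the discussion.

filings = 10_000                 # patent applications filed per eventual blockbuster
cost_per_patent = 500_000        # ~$500K in filing and attorneys' fees per issued patent
annual_return = 5_000_000_000    # ~$5B/year from the single successful product

total_patent_cost = filings * cost_per_patent          # total spend on the portfolio
years_to_recoup = total_patent_cost / annual_return    # sales-years to cover that spend

print(f"Total portfolio cost: ${total_patent_cost:,}")                 # $5,000,000,000
print(f"Years of sales to recoup patent costs: {years_to_recoup:.1f}") # 1.0
```

In other words, under these rough numbers the entire 10,000-patent portfolio costs about one year of the blockbuster's revenue, which helps explain why filing broadly remains rational despite most patents expiring worthless.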

A student asked, “Is submitting a patent the same price for a university as it is for a company?” In essence, no! The patent office distinguishes between large and small entities. Small entities, defined by requirements provided by the patent office [3], pay half the fees, but attorneys charge a fixed price, so in the end small entities save only a small percentage. Another audience member asked, “What is patentable in the pharma business?” If one patents a molecule, no one else can make or use that molecule without infringing the patent. That is how companies patent drugs or their associated components. One can also patent dosing regimens, formulations, modes of administration, etc. The compound claim gives the most protection, because it is very hard to make a knock-off of a molecule.

Intellectual Property in Academia

A student raised the issue that a great deal of communication occurs in science, especially at conferences and symposia and among colleagues and classmates. That poses a real risk to protecting one's intellectual property, but such communication is an unavoidable part of scientific research.

Dora, a patent analyst from PCI Ventures, then addressed these issues from an academic perspective. She said, “The question raised here is that when one works in an academic institution the work is knowledge based and disseminated to others.... How does one draw the line from all that to protect something valuable?” What most, if not all, academic and research institutions do is have their lawyers work very closely with faculty, so that any time faculty are about to publish a paper, go to a conference, attend grand rounds, or make any other such public appearance, the lawyers hustle to get an application submitted beforehand.

In addition to these more public forums, problems can arise from talking with friends who are not directly associated with the work. One example pertains to OPDIVO®, a drug patented by Ono Pharmaceuticals and Kyoto University in the 1990s and later exclusively licensed to Bristol-Myers Squibb, which launched the drug. Recently, the Dana Farber Cancer Institute sued Ono Pharmaceuticals and Bristol-Myers Squibb because the principal investigator at Kyoto University had periodically consulted a colleague at Dana Farber, who would send data he thought was helpful and offer advice. Dana Farber claims that the now-retired professor from its institution should be included as an inventor on the patent. Because an inventor of a patent is a part-owner, Dana Farber is effectively claiming ownership of the patent and would receive compensation from sales of products under it [4,5].

Michael, a Penn Med professor who works closely with a team of lawyers from PCI because he regularly files patents, said that balancing confidentiality with science communication is a difficult task. He commented, “I think it comes down to how important one thinks the invention is and a lot of the times the patent will not get developed if it will not bring any money to the owner (company/institution).” Moreover, there has to be a conversation with the university, because the university pays for the patent and so decides what to file; it also depends on the university's resources. The work of graduate students and postdoctoral fellows raises further considerations: students and postdocs want and need to publish, go to conferences, and present their work in order to move their careers forward, so patents can be a rather limiting step for them.

From the industry perspective, Peter clarified that the rule at Celgene is that no one can talk about anything until the patent application is filed. Once the patent application is filed, employees are free to talk to whomever they wish without causing a situation like the one with Dana Farber and Bristol-Myers Squibb, since the patent application has been filed prior to any communication.

Thus, a clear difference between industry and academia is that in industry things are kept under wraps until a patent is filed, whereas in academia patents are filed early to make sure the institution does not lose the right to patent by making the information public. Because universities file very early, there is a lot to deal with afterwards. The costs of prosecution are high, and sometimes the application does not make it through the full process, because universities cannot afford to spend $500K on an application unless they are confident of a return on the investment. For some universities, the reason to file might be purely strategic.

Ownership vs. Inventorship

Another interesting topic discussed was that of ownership vs. inventorship. There is a notion that ownership follows inventorship, but in most cases people do not file patents on their own; they work for companies or universities. Usually, an employment contract states that if an employee invents something while employed, ownership of any resulting patent is assigned to the employer. Thus, the person is the inventor but not the owner of the patent; the entity is the owner. For academic research, the Bayh-Dole Act was enacted to allow universities to own inventions arising from research funded by the federal government [6]. Dora explained: “Government officials got together and agreed that they awarded so much money into research and good stuff came out of it, which the government would own but not file patents or do anything with it commercially."

A preliminary list of inventors is written when the patent is filed, but legally the inventors are the people who can point to a claim and say, "I thought of that one." Inventors must swear under oath that they thought of a particular claim, and must be able to present notebooks with data supporting a claim of inventorship. Inventors are undivided part-owners of the patent, which means that any inventor listed on the patent can license it in any way, without accounting to any of the other inventors. Additionally, there is a difference between the people who conceive the claims and the people who merely execute the subject matter of the resulting claims: a person who only runs experiments, without contributing intellectually to the idea or procedure, is not an inventor. For those in academic research, this often differs from how paper authorship is decided, where performing an experiment is usually sufficient.

Summary

The discussion prompted the researchers in the room to be on the lookout for ideas that could result in patents, and to be careful when discussing data and results with people outside their own research laboratories. The discussion also exposed key differences between intellectual property lawyers working in-house for universities and companies and those working in the intellectual property departments of law firms. Ultimately, students felt they gained a basic understanding of how intellectual property works, the rules for filing patents, and some intrinsic differences between academic and industry research.

References:

1) United States Patent and Trademark Office. (n.d.). Retrieved December 11, 2016, from https://www.uspto.gov/patents-getting-started/general-information-concerning-patents
2) BITLAW. (n.d.). Retrieved December 11, 2016, from http://www.bitlaw.com/patent/requirements.html
3) United States Patent and Trademark Office. (n.d.). Retrieved December 20, 2016, from https://www.uspto.gov/web/offices/pac/mpep/s2550.html
4) Bloomberg BNA. (2015, October 2). Retrieved December 11, 2016, from https://www.bna.com/dana-farber-says-n57982059025/
5) United States District Court (District of Massachusetts). http://www.dana-farber.org/uploadedFiles/Library/newsroom/news-releases/2015/dana-farber-inventorship-complaint.pdf
6) National Institutes of Health, Office of Extramural Research. (2013, July 1). Retrieved December 11, 2016, from https://grants.nih.gov/grants/bayh-dole.htm

Event Recap: Anonymous Peer Review & PubPeer

by Ian McLaughlin 

On the 24th of October, the Penn Science Policy Group met to discuss the implications of a new mechanism by which individuals can essentially take part in the peer review process.  The group discussion focused on a particular platform, PubPeer.com, which emerged in 2012 and has since become a topic of interest and controversy among the scientific community.  In essence, PubPeer is an online forum that focuses on enabling post-publication commentary, which ranges from small concerns by motivated article readers, to deeper dives into the legitimacy of figures, data, and statistics in the publication.  Given the current state of the widely criticized peer-review process, we considered the advantages and disadvantages of democratizing the process with the added layer of anonymity applied to reviewers.

PubPeer has been involved in fostering investigations of several scandals in science.  One example is the critical evaluation of a pair of papers published in Nature in 2014, one entitled "Stimulus-triggered fate conversion of somatic cells into pluripotency" [1].  The papers described a novel mechanism by which pluripotency might be induced by manipulating the pH environment of somatic cells.  Following publication, however, concerns about the scientific integrity of the published experiments were raised, resulting in the retraction of both papers and an institutional investigation.
  
Subsequently, the publications of a prolific cancer researcher received attention on PubPeer, ultimately resulting in the rescission of a prestigious position at a new institution eleven days before the start date, due at least in part to PubPeer commenters contacting faculty at that institution.  When the professor tried to return to the former position, it was no longer available.  The professor then sued the PubPeer commenters, arguing that the site must identify those who had prevented a continued career in science.  PubPeer, advised by ACLU lawyers working pro bono, is refusing to comply, and enjoys the support of both Google and Twitter, which have filed a court brief in defense of the website [2].
                  
At its best, PubPeer fulfills an unmet, or poorly met, need in the science publication process.  Our discussion group felt that PubPeer pursues a goal the peer review process is meant to accomplish but occasionally falls short of. While increased vigilance is welcome, and bad science, or intentionally misleading figures, should certainly not be published, perhaps the popularity of and activity on PubPeer reveal a correctable problem in the review process rather than a fundamental flaw. While the discussion group didn't focus specifically on problems with the current peer review process, a topic deserving its own discussion [3], the group felt there were opportunities to improve the process, and was ambivalent about whether a platform like PubPeer is sufficiently moderated, vetted, and transparent in the right ways to be an optimal means to this end.
                  
Some ideas proposed by discussion participants included making the peer-review process more transparent, with increased visibility into the reasons a manuscript is or is not published.  Additionally, peer review often relies on the input of just a handful of volunteer experts, all of whom are frequently under time constraints that can jeopardize their ability to evaluate manuscripts thoroughly, occasionally resulting in peer review being assigned to members of related, though not optimally relevant, fields [4].  Some participants suggested that a democratized review process, similar to PubPeer's, might alleviate some of these problems, with the requirement that commenters be moderated to ensure they have relevant expertise.  Alternatively, other participants argued that, given the gatekeeper role journals play, often determining the career trajectories of aspiring scientists, the onus is on journals' editorial staffs to render peer review more effective.  Finally, another idea discussed was to layer a third-party moderation mechanism on top of a platform like PubPeer, ensuring comments are objective, constructive, and unbiased.
                  
The concept of more open peer review is one that many scientists are beginning to take seriously.  In Nature News, Ewen Callaway reported that 60% of authors publishing in Nature Communications agreed to have their peer reviews published [7].  However, while a majority of respondents to a survey funded by the European Commission believed that open peer review ought to become more routine, not all strategies of open peer review received equivalent support.


                  
Ultimately, the group unanimously felt that the popularity of PubPeer ought to signal to the scientific community that something is wrong with the publication process, something that demands our attention and carries potentially destructive ramifications [5].  Every time a significantly flawed article is published, damage is done to the perception of science and of the scientific community.  At a time when the scientific community still enjoys broadly positive public perception [6], now is likely an opportune moment to reconsider the peer-review process, and perhaps to learn some lessons that an anonymous post-publication platform like PubPeer might teach us.

References


1) PubPeer - Stimulus-triggered fate conversion of somatic cells into pluripotency. (n.d.). Retrieved November 25, 2016, from https://pubpeer.com/publications/8B755710BADFE6FB0A848A44B70F7D 

2) Brief of Amici Curiae Google Inc. and Twitter Inc. in Support of PubPeer, LLC. (Michigan Court of Appeals). https://pubpeer.com/Google_Twitter_Brief.pdf

3) Balietti, S. (2016). Science Is Suffering Because of Peer Review’s Big Problems. Retrieved November 25, 2016, from https://newrepublic.com/article/135921/science-suffering-peer-reviews-big-problems

4) Arns M. Open access is tiring out peer reviewers. Nature. 2014 Nov 27;515(7528):467. doi: 10.1038/515467a. PubMed PMID: 25428463.

5) Jha, Alok. (2012). False positives: Fraud and misconduct are threatening scientific research. Retrieved November 25, 2016, from https://www.theguardian.com/science/2012/sep/13/scientific-research-fraud-bad-practice

6) Hayden, E. C. (2015, January 29). Survey finds US public still supports science. Retrieved November 25, 2016, from http://www.nature.com/news/survey-finds-us-public-still-supports-science-1.16818 

7) Callaway E. Open peer review finds more takers. Nature. 2016 Nov 10;539(7629):343. doi: 10.1038/nature.2016.20969. PubMed PMID: 27853233

Event Recap: The Importance of Science-Informed Policy & Law Making

by Ian McLaughlin          

Last week, we held a panel discussion focused on the importance of science-informed policy & law making.  The panel included Dr. Michael Mann, a climatologist and geophysicist at Pennsylvania State University who recently wrote The Madhouse Effect: How Climate Change Denial is Threatening Our Planet, Destroying Our Politics, and Driving Us Crazy.   Dr. Andrew Zwicker, a member of the New Jersey General Assembly and a physicist who heads the Science Education Department of the Princeton Plasma Physics Laboratory, joined him.  Finally, Shaughnessy Naughton, a chemist and entrepreneur who ran for congressional office in Pennsylvania and founded the 314 PAC, which promotes the election of candidates with backgrounds in STEM fields to public office, joined the panel as well.

The event began with personal introductions, with each member characterizing their unique perspectives and personal histories.  Shaughnessy Naughton highlighted the scarcity of legislators with backgrounds in math and science as a primary motivator for encouraging people with science backgrounds to get involved beyond just advocacy. 

Dr. Andrew Zwicker, having previously run for the US House of Representatives, was ultimately successful in his run for the state assembly in an extremely tight race, winning by just 78 votes, or 0.2456% – a level of precision that, he has been told, only a scientist would use, as most would round to a quarter of a percent.  He credited two primary features of his campaign for his success.  First, on a practical level, he utilized a more sophisticated voter model.  As the first Democrat elected in the district's 42-year history [1], it was critical to allocate resources optimally in order to communicate his message effectively.  Second, he identified his background in science as a strength: when campaigning, he made it clear that facts would guide his decisions, and his constituents found that pragmatism appealing.

Next, Dr. Michael Mann summarized his path to prominence in the climate change debate by recounting the political fallout that followed the publication of his now-famous "hockey-stick graph" [2].  In short, the graph shows that average global temperatures were fairly stable until about 1900 (forming the shaft of the hockey stick), at which point a sharp rise in temperature begins (forming the blade).  In articulating why the publication made such a splash, he highlighted the simplicity of the graph: it summarizes otherwise fairly esoteric data in a way that's accessible to non-scientists.  "You don't have to understand the complex physics to understand what the graph was saying: there's something unprecedented taking place today, and, by implication, probably has something to do with what we're doing."  After its publication, he was in for a whirlwind.  The graph became iconic in the climate change debate, provoking the ire of special interests, who then pursued a strategy of personally discrediting Mann.

Naughton initiated the conversation by asking Zwicker whether his background in science had influenced what he has been able to accomplish during his nine months in public office.  While it has at times given him credibility and garnered trust among his peers and constituents, the nature of science is often incongruous with politics: rather than relying solely on facts, politics requires emotional and personal appeals to get things done.  A specific example: the fear of jobs being lost to legislation, particularly reforms focused on energy and climate change, often obscures what would otherwise be a less volatile debate.

Naughton then asked Mann to describe his experience with Ken Cuccinelli, the former Attorney General (AG) of Virginia under former governor Bob McDonnell.  Among the former AG's priorities were targeting the Environmental Protection Agency's ability to regulate greenhouse gas emissions and demanding that the University of Virginia – the institution where Dr. Mann had been an assistant professor from 1999 to 2005 – provide a sweeping compilation of documents associated with him.  Cuccinelli relied on the 2002 Virginia Fraud Against Taxpayers Act, devised to enable the AG to ferret out state waste and fraud, to serve the civil investigative demand.  Ultimately, Cuccinelli's case was rejected, an outcome since considered a major victory for the integrity of academic research and for scientists' privacy.

The panel then invited questions from attendees, which ranged from technical inquiries of how climate estimates were made for the Hockey Stick Curve to perspectives on policy & science communication. 

One question focused on the public's ability to digest and think critically about scientific knowledge, highlighting that organizations and institutions like AAAS and the NSF regularly require funded investigators to spend time communicating their research to a broader audience.  Even so, the relationship between the public and science remains tenuous.  Zwicker responded by identifying a critical gap in efficacy between the beautiful images, data, and press releases coming from NASA and the personal experiences of people outside of science.  Special interest groups can disseminate opinions and perspectives that don't comport with the scientific consensus, and without truly effective science communication, the public simply can't know whom to trust.  He argued that scientists remain a broadly trusted group, but that without competent efforts to communicate the best science, earning that trust remains a major challenge.  Ultimately, he suggested, the solution involves a focus on early education and on teaching critical thinking skills.

Moreover, Mann commented on a problematic fallacy that arises from a misunderstanding of how science works: “there’s a fallacy that because we don’t know something, we know nothing.  And that’s obviously incorrect.” There are many issues at the forefront of science that remain to be understood, but that forefront exists because of relevant established knowledge.  “We know greenhouse gasses warm the planet, and it’ll warm more if we continue burning carbon.  There’s still uncertainty with gravity.  We haven’t reconciled quantum mechanics with general relativity.  Just because we haven’t reconciled all of the forces, and there’s still something to be learned about gravity at certain scales – we still understand that if we jump out the window, we’ll plummet to our deaths.”

Naughton suggested that much of this disconnect between scientific knowledge and public sentiment comes down to communication.  “For many scientists, it’s very difficult to communicate very complex processes and theories in a language that people can understand.  As scientists, you want to be truthful and honest.  You don’t learn everything about quantum mechanics in your first year of physics; by not explaining everything, that doesn’t mean you’re being dishonest.” 

Zwicker highlighted that there aren’t many prominent science communicators, asking the audience to name as many as they could.  Then, he asked if we could name prominent female science communicators, which proved more difficult for the audience.  There isn’t necessarily a simple solution to this obvious problem, given the influence of special interests and concerns of profitability.

An audience member then asked whether the panelists considered nuclear energy a viable alternative – and, in particular “warehouse-ready nuclear”, which describes small modular reactors that operate on a much smaller scale than the massive reactors to which we’ve become accustomed.  Zwicker, as a physicist, expressed skepticism: “You’ll notice there are no small reactors anywhere in the world.  By the time you build a reactor and get through the regulation – and we’re talking 10-30 years to be completed – we’re still far away from them being economically viable.”  He also noted that he’s encountered the argument that investment allocation matters to the success of a given technology, and that investment in one sustainable energy platform may delay progress in others.  The audience then asked about the panel’s perspectives on natural gas, which is characterized by some as a bridge fuel to a lower carbon-emitting future energy source.  Summarizing his perspective on natural gas, Mann argued “a fossil fuel ultimately can’t be the solution to a problem caused by fossil fuels.”

Jamie DeNizio, a member of PSPG, asked whether the panel thought coalitions between state and local governments could be an effective strategy for getting around current barriers at the national level.  Naughton noted that this is ultimately the goal behind the federal Clean Power Plan, with carbon-cutting goals tailored to specific states.  Mann, highlighting the prevalent lack of acceptance of climate change at the federal level, suggested that existing state consortia – like the Regional Greenhouse Gas Initiative (RGGI) in the Northeast, or the Pacific Coast Collaborative (PCC) on the West Coast – are causes for optimism, indicating that progress can be made despite federal gridlock.  Zwicker noted that New Jersey's participation in carbon-credit trading had generated substantial revenue, enabling the state to fund a new hospital.  He suggested that Governor Chris Christie's decision to withdraw from RGGI was imprudent; the New York Times reported that, in 2011, New Jersey had received over $100 million in revenue from RGGI [3].

Another issue that was brought up by the panel was how counterproductive infighting among environmentalists and climate change activists can be to the overall effort.  In particular, this splintering enables critics to portray climate change as broadly incoherent, rendering the data and proposals less convincing to skeptics of anthropogenic climate change.

Adrian Rivera, also a PSPG member, asked the panel whether they felt social media is an effective strategy for communicating science to the general public.  Mann stated that scientists who do not engage on social media are not being as effective as they could be, mostly because a growing subset of the population gets its information via social media platforms. In contrast, Zwicker highlighted the lack of depth on social media: some issues simply require more in-depth discussion than social media tends to accommodate, and he emphasized the importance and value of face-to-face communication. Naughton then offered a specific example of poor science communication translating into tangible problems: "It's not all about policy or NIH/NSF funding.  It's about making sure evolution is being taught in public schools."  She recounted the experience of a botany professor in Susquehanna, PA, who was holding an info-session on biology for high-school teachers. One of the attending teachers told him he was brave for teaching evolution, which Naughton identified as an example of ineffective science communication.

Finally, an environmental activist in the audience noted that a major problem he’d observed in his own approach to advocacy was that he was often speaking through feelings of anger rather than positive terms.  Mann thoroughly agreed, and noted that “there’s a danger when we approach from doom and gloom.  This takes us to the wrong place; it becomes an excuse for inaction, and it actually has been co-opted by the forces of denial.  It is important to communicate that there is urgency in confronting this problem [climate change] – but that we can do it, and have a more prosperous planet for our children and grandchildren.  It’s critical to communicate that.  If you don’t provide a path forward, you’re leading people in the wrong direction.”

The event was co-hosted by 314 Action, a non-profit affiliated with 314 PAC with the goal of strengthening communication among the STEM community, the public, and elected officials.


References:

1. Qian, K. (2015, November 11). Zwicker elected as first Democrat in NJ 16th district. Retrieved October 6, 2016, from http://dailyprincetonian.com/news/2015/11/zwicker-elected-as-first-democrat-in-nj-16th-district/

2. Mann, M. E., Bradley, R. S., & Hughes, M. K. (1999). Northern hemisphere temperatures during the past millennium: Inferences, uncertainties, and limitations. Geophysical Research Letters, 26(6), 759–762. doi:10.1029/1999GL900070

3. Navarro, M. (2011, May 26). Christie Pulls New Jersey From 10-State Climate Initiative. Retrieved October 6, 2016, from http://www.nytimes.com/2011/05/27/nyregion/christie-pulls-nj-from-greenhouse-gas-coalition.html?_r=1&ref=nyregion

Event Recap: Dr. Sarah Rhodes, Health Science Policy Analyst

by Chris Yarosh

PSPG tries to hold as many events as limited time and funding permit, but we cannot bring in enough speakers to cover the range of science policy careers out there. Luckily, other groups at Penn hold fantastic events, too, and this week’s Biomedical Postdoc Program Career Workshop was no exception. While all of the speakers provided great insights into their fields, this recap focuses on Dr. Sarah Rhodes, a Health Science Policy Analyst in the Office of Science Policy (OSP) at the National Institutes of Health (NIH).

First, some background: Sarah earned her Ph.D. in Neuroscience from Cardiff University in the U.K., and served as a postdoc there before moving across the pond and joining a lab at the NIH. To test the policy waters, Sarah took advantage of NIH’s intramural detail program, which allows scientists to do temporary stints in administrative offices. For her detail, Sarah worked as a Policy Analyst in the Office of Autism Research Coordination (OARC) at the National Institute of Mental Health (NIMH). That experience convinced her to pursue policy full time. Following some immigration-related delays, Sarah joined OARC as a contractor and later became a permanent NIH employee.

After outlining her career path, Sarah provided an overview of how science policy works in the U.S. federal government, breaking the field broadly into three categories: policy for science, science for policy, and science diplomacy. According to Sarah (and as originally promulgated by Dr. Diane Hannemann, another one of this event’s panelists), the focus of different agencies roughly breaks down as follows:


This makes a lot of sense. Funding agencies like NIH and NSF are mostly concerned with how science is done, Congress is concerned with general policymaking, and the regulatory agencies both conduct research and regulate activities under their purview. Even so, Sarah did note that all these agencies do a bit of each type of policy (e.g. science diplomacy at NIH Fogarty International Center). In addition, different components of each agency have different roles. For example, individual Institutes focus more on analyzing policy for their core mission (aging at NIA, cancer at NCI, etc.), while the OSP makes policies that influence all corners of the NIH.

Sarah then described her personal duties at OSP’s Office of Scientific Management and Reporting (OSMR):
  • Coordinating NIH’s response to a directive from the President’s Office of Science and Technology Policy related to scientific collections (think preserved specimens and the like)
  • Managing the placement of AAAS S&T Fellows at NIH
  • Supporting the Scientific Management Review Board, which advises the NIH Director
  • Preparing for NIH’s appropriations hearings and responding to Congressional follow-ups
  • “Whatever fires need to be put out”
If this sounds like the kind of job for you, Sarah recommends building a professional network and developing your communication skills ASAP (perhaps by blogging!?). This sentiment was shared by all of the panelists, and it echoes advice from our previous speakers. Sarah also strongly recommends volunteering for university or professional society committees. These bodies work as deliberative teams and are therefore good preparation for the consensus-driven style of government work.

For more information, check out the OSP’s website and blog. If you’re interested in any of the other speakers from this panel, I refer you to the Biomedical Postdoc Program.

Event Recap: Dr. Sarah Martin, ASBMB Science Policy Fellow

by Ian McLaughlin

On February 11th, Dr. Sarah Martin, a Science Policy Fellow at the American Society for Biochemistry and Molecular Biology (ASBMB), visited Penn to chat about her experience working in science policy. As it turns out, her story is perhaps more circuitous than one might expect.

An avid equestrian, Sarah earned a bachelor’s degree in animal sciences and a master’s degree in animal nutrition at the University of Kentucky before embarking on a Ph.D. in Molecular and Cellular Biochemistry at UK’s College of Medicine. While pursuing her degrees, Sarah realized that the tenure track was not for her, and she began exploring career options using the Individual Development Plan (IDP) provided by AAAS Careers. At the top of the list: science policy.

With an exciting career option in mind, Sarah sought ways to build “translatable skills” during her Ph.D. to help her move toward science policy. She served as treasurer, and later Vice President, of UK’s Graduate Student Congress and developed her communication skills by starting her own blog and participating in Three Minute Thesis. Sarah stressed the importance of communicating with non-scientists, and she highlighted how her practice paid off during Kentucky’s first-ever State Capitol Hill Day, an event that showcases Kentucky-focused scientific research to that state’s legislators.

Sarah also shared how she got up to speed on science policy issues, becoming a “student of policy” by voraciously reading The Hill, Roll Call, Politico, ScienceInsider, and ASBMB’s own Policy Blotter. Additionally, she started to engage with peers, non-scientists, and legislators on Twitter, noting how it’s a useful tool to sample common opinions on issues related to science. Finally, she reached out to former ASBMB fellows for advice on how to pursue a career in science policy – and they were happy to help.

Sarah then described the typical responsibilities of an ASBMB fellow, breaking them down into four categories:
  1. Research – tracking new legislation, plus a daily diet of articles on new developments in science and policy
  2. Meetings – with legislators on Capitol Hill, staff at the NIH, partner organizations such as the Federation of American Societies for Experimental Biology (FASEB), and others
  3. Writing – white papers, position statements, and blog posts on everything from ASBMB’s position on gene editing to the NIH Strategic Plan for FY 2016-2020
  4. Administration – organizing and preparing for meetings, composing executive summaries, and helping to plan and organize ASBMB’s Hill Day

Sarah also talked about her own independent project at ASBMB, a core component of each Fellowship experience. Sarah aims to update ASBMB’s Advocacy Toolkit in order to consolidate all of the resources a scientist might need to engage in successful science advocacy.

Comparing the ASBMB fellowship to similar programs, she noted one advantage: it has no fixed end date, which gives Fellows plenty of time to find permanent positions that match their interests. Sarah also noted that, compared to graduate students and postdocs, she enjoys an excellent work/life balance.

Ultimately, Sarah made it clear that she loves what she does. She closed by providing the following resources from ASBMB Science Policy Analyst Chris Pickett for anyone interested in applying for the ASBMB fellowship or pursuing a career in science policy:

Ready to Adapt: Experts Discuss Philadelphia Epidemic Preparedness

by Jamie DeNizio and Hannah Shoenhard


In early November, public health experts from a variety of organizations gathered on Penn’s campus to discuss Philadelphia’s communication strategies and preparation efforts in the event of an epidemic outbreak. In light of recent crises, such as H1N1 and Ebola in the US, AAAS Emerging Leaders in Science and Society (ELISS) fellows and the Penn Science Policy Group (PSPG) hosted local experts at both a public panel discussion and a focus group meeting to understand the systems currently in place and develop ideas about what more can be done.
Are we prepared?: Communication with the public
Dr. Max King, moderator of the public forum, set the tone for both events with a Benjamin Franklin quote: “By failing to prepare, you are preparing to fail.” Measures taken before a crisis begins can make or break the success of a public health response. In particular, in the age of the sensationalized, 24-hour news cycle, the only way for public health professionals to get the correct message to the public is to establish themselves as trustworthy sources of information in the community ahead of time.
For reaching the general population, the advent of social media has been game-changing. As an example, James Garrow, Director of Digital Public Health for the Philadelphia Department of Public Health, described Philadelphia’s use of its Facebook page to rapidly disseminate information during the H1N1 flu outbreak. The city was able to provide detailed information while interacting with and answering questions directly from members of the public in real time, a considerable advantage over traditional TV or print news.

However, Garrow was quick to note that “mass media still draws a ton of eyeballs,” and that any public health outreach program would be remiss to neglect traditional media such as TV, radio, and newspapers. At this point, social media is a complement to, but not a replacement for, other forms of media engagement.
Furthermore, those typically at greater risk during an epidemic are often unable to interact with social media channels due to economic disadvantage, age, or a language barrier. In Philadelphia, 21.5% of the population speaks a language other than English at home, and 12.5% of the population is over the age of 65 (U.S. Census Bureau). The focus group meeting specifically discussed how to reach these underserved groups. Suggestions included “block captains” and registries: “block captains” would be Philadelphia citizens responsible for communicating important information to residents of their designated block or neighborhood. In addition to these methods of reaching individuals, there was general agreement on the need for translation-friendly, culturally relevant public health messages.

For example, during the open forum, Giang T. Nguyen, leader of the Penn Asian Health Initiative and Senior Fellow of the Penn Center for Public Health Initiatives, emphasized the importance of building ties with “ethnic media”: small publications or radio channels that primarily cater to immigrant communities in their own languages. He noted that, in the past, lack of direct contact between government public health organizations and non-English-speaking communities has led to the spread of misinformation in these communities.

On the other hand, Philadelphia has also successfully engaged immigrant communities in the recent past. For example, Garrow pointed to Philadelphia’s outreach in the Liberian immigrant community during the Ebola outbreak as a success story. When the outbreak began, the health department had already built strong ties with the Liberian community, to the point where the community actively asked the health department to hold a town hall meeting, rather than the reverse. This anecdote demonstrates the importance of establishing trust and building ties before a crisis emerges.
With regards to both general and community-targeted communication, the experts agreed that lack of funding is a major barrier to solving current problems. At the expert meeting, it was suggested that communication-specific grants, rather than larger grants with a certain percentage allotted for communication, might be one way of ameliorating this problem.
Are we prepared?: Communication between health organizations

The need for established communications networks extends beyond those for communicating directly with individuals. It is crucial for the local health department and healthcare system to have a strong relationship. Here in Philadelphia, the health department has a longstanding relationship with Penn Medicine, as well as other universities and major employers. In case of an emergency, these institutions are prepared to distribute vaccines or other medicines. Furthermore, mechanisms for distribution of vaccines already in place are “road-tested” every year during flu season. As an example, Penn vaccinated 2,500 students and faculty for the flu in eight hours during a recent vaccination drive, allowing personnel to sharpen their skills and identify any areas that need improvement.

In addition to the strong connections between major Philadelphia institutions, there is also a need for smaller health centers and community centers to be kept in the loop. These small providers serve as trusted intermediaries between large public health organizations and the public. According to the experts, these relationships are already in place. For example, during the recent Ebola crisis, the CDC set up a hotline for practitioners to call if one of their patients returned from an Ebola-stricken country with worrying symptoms. “You can’t expect everyone in the entire health system to know all they need to know [about treating a potential Ebola case],” said Nguyen, “but you can at least ensure that every practice manager and medical director knows the phone number to call.”

Can we adapt?
Ultimately, no crisis situation is fully predictable. Therefore, what matters most for responders is not merely having the proper protocols, resources, and avenues of communication in place, but also the ability to adjust their reaction to a crisis situation as it evolves. As Penn behavioral economics and health policy expert Mitesh Patel pointed out at the end of the open forum, “It’s not ‘are we ready?’; it’s ‘are we ready to adapt?’”
The topic of adaptability was also heavily discussed at the focus group meeting. The lack of a central communication source was identified as one potential barrier to adaptability, as was slow response from agencies further up the chain of command, such as the CDC. However, experts disagreed about the precise degree of control the CDC should have at the local level. For example, representatives from local government agencies, which are more directly accountable to the CDC, expressed a desire for the CDC to proactively implement strategies, instead of attempting to direct the local response once it has already begun. Many physicians and hospital representatives, on the other hand, were of the opinion that plans formulated by the people closest to the crisis may be superior due to their situational specificity and lack of red tape. Despite this point of contention, experts agreed that hospitals in a given region need some consensus and coordination on how to respond to a large-scale health event.

One gap in Philadelphia’s preparedness identified by the experts in the focus group is its ability to case manage novel diseases—a challenge, since often the transmission route of novel diseases is not known. Some experts in the meeting also expressed doubt that Philadelphia is prepared for a direct biological attack. However, numerous epidemic-response frameworks already in place could potentially be repurposed for novel or deliberately-spread pathogens. In these cases, even more so than in “typical” epidemic situations, the experts identified adaptability as a key factor for success.

At the end of the open forum, the panelists affirmed the belief that Philadelphia is as prepared as it can be for an infectious disease crisis. Furthermore, it seemed they had also moved the opinions of the event’s attendees: before the forum, attendees rated Philadelphia’s readiness at an average of 3.1 on a scale of 0 (“not at all ready”) to 6 (“completely ready”), while afterwards, the same attendees rated Philadelphia’s readiness at an average of 3.9 on the same scale (p=0.07, paired t-test).
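For readers curious how such a before/after comparison is computed: a paired t-test divides the mean of the per-person rating changes by the standard error of those changes. A minimal sketch, using hypothetical per-attendee ratings (the individual survey responses are not published here; these numbers are invented to match the reported 3.1/3.9 averages):

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t-test statistic: mean of the per-person differences
    divided by the standard error of those differences."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical ratings on the 0-6 readiness scale (NOT the actual survey data)
before = [2, 3, 3, 3, 3, 3, 4, 3, 4, 3]   # mean 3.1
after  = [5, 5, 5, 4, 4, 4, 4, 3, 3, 2]   # mean 3.9

t = paired_t(before, after)
print(round(t, 2))  # → 1.92
```

Note that the t statistic (and hence the p-value) depends on the spread of the individual changes, not just the two averages, which is why the same 3.1-to-3.9 shift can be significant or not depending on how consistently attendees moved.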

Training the biomedical workforce - a discussion of postdoc inflation


By Ian McLaughlin


Earlier this month, postdocs and graduate students from several fields met to candidly discuss the challenges postdocs encounter while pursuing careers in academic research. The meeting began with an enumeration of those challenges: the scarcity of faculty positions and the ballooning number of rising postdocs, funding mechanisms and cuts, the sub-optimal relationship between publications and the quality of science, and the inaccurate conception of what exactly a postdoctoral position should entail.


From [15]

At a fundamental level, there is a surplus of rising doctoral students whose progression outpaces the availability of faculty positions at institutions capable of hosting the research they intend to perform [10,15]. While 65% of PhDs attain postdocs, only 15-20% of postdocs attain tenure-track faculty positions [1]. This translates into significantly extended postdoctoral positions, taken with the intention of bolstering credentials and generating more publications to increase appeal to hiring institutions. Despite this added time, postdocs often do not benefit from continued teaching experience, and are also unable to attend classes to cultivate professional development.


From [10]
Additionally, there may never be an adequate position available. Instead of providing the training and mentorship necessary to generate exceptional scientists, postdoctoral positions have become “holding tanks” for many PhD holders unable to transition into permanent positions [5,11], resulting in considerably lower compensation relative to alternative careers 5 years after attaining a PhD.

From [13]

Perhaps this wouldn’t be quite so problematic if the compensation of the primary workhorse of basic biomedical research in the US were better. In 2014, the US National Academies called for an increase in the starting postdoc salary from $42,840 to $50,000 – as well as a 5-year limit on the length of postdocs [1]. While the salary increase would certainly help, term limits have had mixed results: institutions like NYU, the University of California system, and UNC Chapel Hill have explored them, and a frequent outcome was the promotion of postdocs to superficial positions that simply confer a new title but are effectively extended postdocs.

Given the time commitment required to attain a PhD, and the expanding durations of postdocs, several of the meeting’s attendees identified a particularly painful interference with their ability to start a family.  Despite excelling in challenging academic fields at top institutions, and dedicating professionally productive years to their work, several postdocs stated that they don’t foresee the financial capacity to start a family before fertility challenges render the effort prohibitively difficult.

However, NIH administrators have suggested that this apparent disparity between the number of rising postdocs and available positions is not a significant problem, despite having no data to back up their position. As Polka et al. wrote earlier this year, NIH administrators don’t have data quantifying the total number of postdocs in the country at their disposal – calling into question whether they are prepared to address this fundamental problem [5].

A possible approach to mitigating this lack of opportunity would be to create permanent “superdoc” positions for talented postdocs who don’t have ambitions to start their own labs but have the technical skills needed to advance basic research. The National Cancer Institute (NCI) has proposed a grant program to cover salaries of $75,000-$100,000 for 50-60 such positions [1,2], and the program might be expanded to cover the salaries of more scientists. Additionally, a majority of the postdocs attending the meeting voiced their desire for more comprehensive career guidance. In particular, while they are aware that PhD holders are viable candidates for jobs outside of academia, the career trajectory out of academia remains opaque to them.

This situation stands in stark contrast to the misconception that the US suffers from a shortage of STEM graduates. While the careers of postdocs stall due to a scarcity of faculty positions, the President’s Council of Advisors on Science and Technology announced a goal of one million STEM trainees in 2012 [3], despite the fact that only 11% of students graduating with bachelor’s degrees in science end up in science-related fields [4] – due in part, perhaps, to an inflated sense of job security. While the numbers of grad students and postdocs have nearly doubled, the growth of permanent research positions hasn’t been commensurate [5]. So, while making science a priority is certainly prudent, the point of tension is not necessarily a shortage of students entering these fields, but rather a paucity of research positions available to them once they’ve attained graduate degrees.

Suggested Solutions

Ultimately, if the career prospects for academic researchers in the US don't change, increasing numbers of PhD students will leave basic science research in favor of alternatives that offer better compensation and career trajectories – or leave the country for international opportunities. At the heart of the problem are a fundamental imbalance between the funding available for basic academic research and the growing community of scientists in the U.S. [9,14] and a dysfunctional career pipeline in biomedical research [9]. Attendees suggested several strategies for confronting this problem.

Federal grant-awarding agencies need to collect accurate data on the yearly numbers of postdoctoral positions available.  This way, career counselors, potential students, rising PhD students, and the institutions themselves will have a better grasp of the apparent scarcity of academic research opportunities.

As the US National Academies have suggested, the postdoc salary ought to be increased. One possible strategy would be to increase the prevalence of “superdoc”-type positions, creating a viable career path for talented researchers who wish to continue in research, and to support a family, without securing the funding needed to open their own labs. Additionally, if the institutions at which postdocs receive federal funding were to consider them employees with all associated benefits, rather than trainees, rising scientists might better avoid career stagnation and an inability to support families [11].

As the number of rising PhDs currently outpaces the availability of permanent faculty positions, one strategy may be to limit the number of PhD positions available at each institution, preventing a continued escalation in the number of postdocs without viable faculty positions to apply for. One attendee noted that this could immediately halt the growth in the number of PhDs facing bleak career prospects.

Several attendees brought up the problems many postdocs encounter in particularly large labs, which tend to receive disproportionately high grant funding. Postdocs in such labs feel pressure to generate useful data so they can compete with their peers, while neglecting other elements of their professional development and personal lives. The current system also funnels funding to labs that can guarantee positive results, favoring conservative rather than potentially paradigm-shifting proposals – translating to reduced funding for new investigators [9]. Grant-awarding agencies’ evaluations of proposals might take lab size into consideration, with the goal of fostering progress in smaller labs. Additionally, efforts like Cold Spring Harbor Laboratory’s bioRxiv might be more widely used to share work in progress so that postdocs are aware of the efforts of their peers – enabling them to focus on innovation when appropriate.

While increased funding for basic science research would help to avoid the loss of talented scientists, and private sources may help to compensate for fickle federal funds [6], some attendees of the meeting suggested restructuring the way facilities and administration costs are funded. These costs, also called “indirect costs” – which cover the expenditures associated with running research facilities rather than specific projects – currently send over 50 cents of every federally allocated dollar to the institution itself rather than to the researchers of the grant-funded projects [7,8]. This dynamic has been suggested to foster the growth of institutions rather than investment in researchers, and optimizing this component of research funding might reveal opportunities to better support the careers of rising scientists [9,12].

Additionally, if federal funding were more predictable, dramatic fluctuations in the numbers of faculty positions and rising scientists might not produce such disparities [9]. For example, if appropriations legislation consistently adhered to five-year funding plans, biomedical research might avoid unexpected deficits of opportunities.


From [5]

Career counselors ought to give their students an accurate picture of how competitive the search for permanent faculty positions can be, so they don’t enter the field with a misconceived sense of security. Quotes from a survey conducted by Polka et al. reveal a substantial disparity between expectations and outcomes in academic careers, and adequate guidance might help avoid such circumstances.

As shown in the NSF’s Indicators report from 2014, the most rapidly growing reason postdocs give for taking their positions is “other employment not available” – suggesting that a PhD in fields associated with the biomedical sciences currently translates to limited opportunities. Even successful scientists and talented postdocs have become progressively more pessimistic about their career prospects. Accordingly – while there are several possible solutions to this problem – if some remedial action isn’t taken, biomedical research in the U.S. may stagnate and suffer in coming years.

Citations
 1.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 2.    http://news.sciencemag.org/biology/2015/03/cancer-institute-plans-new-award-staff-scientists
 3.    https://www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf
 4.    http://www.nationalreview.com/article/378334/what-stem-shortage-steven-camarota
 5.    Polka JK, Krukenberg KA, McDowell GS. A call for transparency in tracking student and postdoc career outcomes. Mol Biol Cell. 2015 Apr 15;26(8):1413-5. doi: 10.1091/mbc.E14-10-1432.
 6.    http://sciencephilanthropyalliance.org/about.html
 7.    http://datahound.scientopia.org/2014/05/10/indirect-cost-rate-survey/
 8.    Ledford H. Indirect costs: keeping the lights on. Nature. 2014 Nov 20;515(7527):326-9. doi: 10.1038/515326a. Erratum in: Nature. 2015 Jan 8;517(7533):131
 9.    Alberts B, Kirschner MW, Tilghman S, Varmus H. Rescuing US biomedical research from its systemic flaws. Proc Natl Acad Sci U S A. 2014 Apr 22;111(16):5773-7. doi: 10.1073/pnas.1404402111. Epub 2014 Apr 14.
 10.    National Science Foundation (2014) National Science and Engineering Indicators (National Science Foundation, Washington, DC).
 11.    Bourne HR. A fair deal for PhD students and postdocs. Elife. 2013 Oct 1;2:e01139. doi: 10.7554/eLife.01139.
 12.    Bourne HR. The writing on the wall. Elife. 2013 Mar 26;2:e00642. doi: 10.7554/eLife.00642.
 13.    Powell K. The future of the postdoc. Nature. 2015 Apr 9;520(7546):144-7. doi: 10.1038/520144a.
 14.    Fix the PhD. Nature. 2011 Apr 21;472(7343):259-60. doi: 10.1038/472259b.
 15.   Schillebeeckx M, Maricque B, Lewis C. The missing piece to changing the university culture. Nat Biotechnol. 2013 Oct;31(10):938-41. doi: 10.1038/nbt.2706.

AAAS Forum Take #2

Another point of view of the AAAS Forum by Matthew Facciani:

I have provided scientific testimony and met with some of my local legislators, but I’ve never had any formal exposure to science policy. I was really excited to attend the AAAS Science & Technology Policy Forum to learn more about how scientists can impact policy. The information I absorbed at the conference was overwhelming, but incredibly stimulating. Some of the lectures discussed budget cuts and the discouraging barriers to shaping science policy. However, I felt there was definitely an atmosphere of optimism at the conference, focused on how we can create positive change.

One of my favorite aspects of the conference was the discussion of how to effectively communicate science to non-scientists. Before we can even have discussions of funding, the general public needs to understand how science works and why basic science is so important. For example, science never proves anything with 100% certainty, but it may sound weak if politicians only say that science “suggests” rather than “proves.” One creative way to circumvent this problem is to use comparisons. Instead of saying “science suggests GMOs are safe,” we could say “scientists are as sure that GMOs are safe as they are sure that smoking is bad for your health.” The conference was rife with these kinds of effective tactics, and I left with a sense of confidence that we can collectively make a difference in science policy.


Matthew Facciani is a sociology PhD student at The University of South Carolina. He is also a gender equality activist and science communicator. Learn more at www.matthewfacciani.com, and follow him at @MatthewFacciani.

Asking for a Small Piece of the Nation’s Pie

By Rosalind Mott, PhD


This article was originally published in the Penn Biomed Postdoctoral Council Newsletter (Spring 2015).

Historically, the NIH has received straightforward bipartisan support; in particular, the doubling of the NIH budget from FY98-03 led to rapid growth in university-based research. Unfortunately, ever since 2003, inflation has been slowly eating away at the doubling effort (Figure 1), and there seems little hope for recovery other than the brief restoration in 2009 by the American Recovery and Reinvestment Act (ARRA). Making matters worse, Congress now has an abysmal record of moving policy through, as partisan fighting dominates the Hill.

Fig 1: The slow erosion of the NIH budget over the past decade
(figure adapted from: http://fas.org/sgp/crs/misc/R43341.pdf)
Currently, support directed to the NIH is a mere 0.79% of federal discretionary spending. The bulk of this funding goes directly to extramural research, providing salaries for over 300,000 scientists across 2,500 universities. As the majority of biomedical researchers rely on government funding, it behooves these unique constituents to rally for sustainable support from Congress. Along with other scientists across the country who are becoming more politically involved, the Penn Science Policy Group arranged a Congressional Visit Day (CVD) in which a small group of postdoctoral researchers and graduate students visited Capitol Hill on March 18th to remind the House and Senate that scientific research is a cornerstone of the US economy and to alert them to the impact of this erosion on young researchers.

Led by post-docs Shaun O’Brien and Caleph Wilson, the group partnered with the National Science Policy Group (NSPG), a coalition of young scientists across the nation, to make over 60 visits to Congressional staff. NSPG leaders from other parts of the country, Alison Leaf (UCSF) and Sam Brinton (Third Way, Washington, DC), arranged for a productive experience in which newcomers to the Hill were trained for their meetings. The Science Coalition (TSC) provided advice on how to communicate effectively with politicians: keep the message clear and simple, provide evidence of how science positively impacts society and the economy, and tell personal stories of how budget cuts are affecting your research. TSC also pointed out that face-to-face meetings with Congress are the most effective way to communicate our needs as scientists. With the announcement of President Obama’s FY16 budget request in February, the House and Senate are in the midst of appropriations season, so there was no better time to remind them of just how important the funding mechanism is.

Meetings with the offices of Pennsylvania senators Pat Toomey and Bob Casey and representatives Glenn Thompson and Chaka Fattah were key goals, but the visits extended to the states where the young scientists were born and raised – everywhere from Delaware to California. Each meeting was fifteen to twenty minutes of rapid discussion of the importance of federally funded basic research. At the end of the day, bipartisan support for the NIH was found to exist at the government’s core, but the hotly debated question of how to fund the system has stalled its growth.
Shaun O’Brien recaps a disappointing experience making basic requests of Senator Toomey’s office. Sen. Toomey has slowly shifted his stance to be more supportive of the NIH, so meeting with his office was an important step in reaching Republicans:

We mentioned the "Dear Colleague" letter by Sen. Bob Casey (D-PA) and Sen. Richard Burr (R-NC) that asks budget appropriators to "give strong financial support for the NIH in the FY2016 budget". Sen. Toomey didn't sign onto it last year, especially as that letter asked for an increase in NIH funding to $31-32 billion and would have violated the sequester caps, which Sen. Toomey paints as a necessary evil to keep Washington spending in check. I asked the staffer for his thoughts on this year's letter, especially as it has no specific dollar figure and Sen. Toomey has stated his support for basic science research. The staffer said he would pass it along to Sen. Toomey and let him know about this letter.

Unfortunately, three weeks later, Sen. Toomey missed an opportunity to show his "newfound" support for science research as he declined to sign a letter that essentially supports the mission of the NIH.  I plan to call his office and see if I can get an explanation for why he failed to support this letter, especially as I thought it wouldn't have any political liability for him to sign.

Working with Congressman Chaka Fattah balanced the disappointment from Toomey with a spark of optimism. Rep. Fattah, a strong science supporter and member of the House Appropriations Committee, encourages scientists to use Twitter (tweet @chakafattah) to keep him posted on recent success stories and breakthroughs; these bits of information are useful tools for arguing the importance of basic research to other politicians.

Keeping those lines of communication strong is the most valuable role we can play away from the lab. Walking through the Russell Senate Office Building, a glimpse of John McCain waiting for the elevator made the day surreal, far removed from the normalcy of another day at the bench. The reality, though, is that our future as productive scientists depends gravely on public opinion and, in turn, government support. The simple act of outreach to the public and politicians is a duty shared by all scientists, whether through trips to the Hill or simple dinner conversations with our non-scientist friends.


Participants represented either their professional society and/or the National Science Policy Group, independent from their university affiliations. Support for the training and experience was provided by both the American Academy of Arts & Sciences (Cambridge, MA) and the American Association for the Advancement of Science (AAAS of Washington, DC).

Dr. Sarah Cavanaugh discusses biomedical research in her talk, "Homo sapiens: the ideal animal model"

Biology and preclinical medicine rely heavily upon research in animal models such as rodents, dogs, and chimps. But how translatable are the findings from these animal models to humans? And what alternative systems are being developed to provide more applicable results while reducing the number of research animals?
Image courtesy of PCRM


Last Thursday, PSPG invited Dr. Sarah Cavanaugh from the Physicians Committee for Responsible Medicine to discuss these issues. In her talk, entitled “Homo sapiens: the ideal animal model,” she emphasized that we are not particularly good at translating results from animal models to human patients. FDA data indicate that 90% of drugs that perform well in animal studies fail when tested in clinical trials. It may seem obvious, but it is important to point out that the biology of mice is not identical to human biology. Scientific publications have demonstrated important dissimilarities in the pathology of inflammation, diabetes, cancer, Alzheimer’s, and heart disease.

All scientists understand that model systems have limitations, yet they have played an integral role in shaping our understanding of biology. But is it possible to avoid using experimental models entirely and just study human biology?

The ethics of studying biology in people are different from those of studying biology in animals. The “do no harm” code of medical ethics dictates that we cannot perform experiments with no conceivable benefit for the patient, so unnecessarily invasive procedures cannot be undertaken just to obtain data. This limitation restricts how much we can learn about human biology relative to animal biology. Nevertheless, medical researchers do uncover important findings from human populations. Dr. Cavanaugh pointed out that studies of risk factors (both genetic and environmental) and biomarkers are important for understanding diseases, and non-invasive brain imaging has increased our understanding of neurodegenerative diseases like Alzheimer’s.

Yet these are all correlative measures. They show that factor X correlates with a higher risk of a certain disease. But in order to develop effective therapies, we need to understand cause and effect relationships - in other words, the mechanism. To uncover mechanisms researchers need to be able to perturb the system and measure physiological changes or observe how a disease progresses. Performing these studies in humans is often hard, impossible, or unethical. For that reason, researchers turn to model systems in order to properly control experimental variables to understand biological mechanisms. We have learned a great deal about biology from animal models, but moving forward, can we develop models that better reflect human biology and pathology?

Using human post-mortem samples and stem cell lines is one way to avoid species differences between animals and humans, but studying isolated cells in culture does not reflect the complex systems-level biology of a living organism. To tackle this problem, researchers have started designing ways to model 3D human organs in vitro, such as the brain-on-a-chip system. Researchers also have envisioned using chips to model a functioning body using 10 interconnected tissues representing organs such as the heart, lungs, skin, kidneys, and liver.
Image from: http://nanoscience.ucf.edu/hickman/bodyonachip.php

Dr. Cavanaugh explained that toxicology is currently a field where chip-based screening shows promise. It makes sense that organs-on-a-chip technology could be useful for screening drug compounds before testing in animals. Chip-screening could filter out many molecules with toxic effects, thus reducing the number of compounds that are tested in animals before being investigated clinically.

A major counterpoint raised during the discussion was whether replacing animal models with human organs on a chip simply trades one imperfect, contrived model for another. Every model has limitations, so short of directly testing therapeutics in humans, it is unlikely that we will ever create a system that perfectly reflects the biological response in patients. The question then becomes: which models are more accurate? While ample data show the limitations of animal models, very little evidence shows that animal-free alternatives perform better than existing animal models. Dr. Cavanaugh argues, however, that this is an opportunity to develop such models instead of continuing to pursue research in flawed animal models. “I don’t advocate that we end all animal research right now, rather that we invest in finding alternatives to replace the use of animals with technologies that are more relevant to human biology.”

This topic can ignite a passionate debate within the medical research community. Animal models are the status quo in research, and they are the gatekeepers in bench-to-bedside translation of scientific discoveries into therapeutics. In the absence of any shift in ethical standards for research, replacing animal models with alternatives will require mountains of strong data demonstrating better predictive performance. The incentives exist, though. Drug companies spend roughly $2.6 billion to gain market approval for a new prescription drug. Taking a drug into human trials and watching it fail is a huge waste of money. If researchers could develop new models for testing drugs that were more reliable than animal models at predicting efficacy in humans, it’s safe to say that Big Pharma would be interested. Very interested.


-Mike Allegrezza

"Wistar rat" by Janet Stephens via Wikimedia Commons 

At the interface of science and society - a career fostering public interest in science at The Franklin Institute.

Credit: The Franklin Institute
Everybody loves science museums. Their fun and interactive way of presenting science reconnects you with your childhood self, when you were curious, when you wondered, and when you were so amazed that you could only manage to say, “Wow!” But what is it like to work at a science museum?

On Wednesday, we hosted Jayatri Das, PhD, to describe her career engaging the public with science as the Chief Bioscientist at The Franklin Institute. As you would expect, her transition from the lab into the museum was cultivated by a strong interest in outreach and teaching. After receiving her PhD from Princeton, she gained experience as a Christine Mirzayan Science and Technology Policy Fellow developing programs for the Marian Koshland Science Museum in Washington, DC. Following a short post-doctoral appointment, she landed a position with The Franklin Institute, an opportunity that she partly ascribes to fortuitous timing, as PhD level positions at museums are rare.

In her job she embraces a new paradigm for how science should interact with society. The goal is no longer public understanding of science. Rather, she urges we should strive for public engagement with science. “We want to communicate to our visitors that they are part of the conversation on how we use science and technology,” she says.

Science and technology do not exist in a void. As Jayatri describes:

1) Values shape technology.
2) Technology affects social relationships.
3) Technologies work because they are part of systems.

As an example, consider nanotechnology. This field has opened new possibilities for quantum computing, high-tech military clothing, flexible inexpensive solar panels, clean energy, simple water filters, and new cancer treatments; even invisibility cloaks and elevators into space have been envisioned. But which of these technologies are developed will depend on the values of those funding the research and the circumstances driving market demand. Priorities would be different for a wealthy businesswoman in Japan, a US-trained Iraqi soldier, a European who lost a spouse to cancer, and a cotton farmer in India.

As she points out, “Investments [in R&D] are being made by people with values different than most of the world’s population.” Therefore, it is important to challenge people to think globally.

Why are science museums a great place for these conversations? First, they provide trusted and stimulating information. Second, they are a place where people can reflect on science, technology, and the world. And third, they are a place for conversation because many visitors attend in groups.

Part of her job involves designing the many ways that The Franklin Institute engages the public with science, which in addition to interactive exhibits includes public programs, digital media, and partnerships with schools and communities. For instance, she recently led a public discussion about concussions in sports. The all-ages audience was presented with the neuroscience of head trauma and testimony from former Eagles’ linebacker Jeremiah Trotter, and then they discussed what age kids should be allowed to play tackle football.

Because science and technology are so integrated into our lives now, conversations like these are crucial. In order for breakthroughs to be beneficial for society, they have to interact with public attitudes and values. This communication between science and society occurs naturally at science museums, so they offer fulfilling positions for people like Jayatri who are motivated to connect the frontiers of science with casual visitors. 


Interested in volunteering? You can find information here

Bioethics/Policy Discussion - Storage and Weaponization of Biological Agents (Biosafety)

This summer has seen a surge in discussion over biosafety. Should we still be storing smallpox? Is the risk of bioterrorism greater now, in the post-genomic era? Should we artificially increase virulence in the lab to prepare for the possibility of it arising naturally in the environment?

On Tuesday the Penn Science Policy Group discussed the issue of biosafety as it relates to potential uses of biological weapons and risks of accidental release of pathogens from research labs.

The idea of using biological weapons existed long before cells, viruses, and bacteria were discovered. Around 1500 B.C.E., the Hittites in Asia Minor deliberately sent diseased victims into enemy lands. In 1972, an international treaty known as the Biological Weapons Convention officially banned the possession and development of biological weapons, but that has not ended bioterrorist attacks. In 1984, a cult in Oregon tried to rig a local election by poisoning voters with salmonella. Anthrax has been released multiple times: a religious group released it in Tokyo in 1993, and in 2001 it was mailed to members of the US Congress. And recently, an Italian police investigation alleged the existence of a major criminal organization run by scientists, veterinarians, and government officials that attempted to spread avian influenza to create a market for a vaccine they illegally produced and sold.
Graphic by Rebecca Rivard

Possibilities for bioterrorism are now compounded by advances in biological knowledge and the ease of digital information sharing, which raises the question: should we regulate dual-use research, defined as research that could be turned to beneficial or malicious ends? Only in the last few years have funding agencies officially screened proposals for potential dual-use research. After two research groups reported studies in 2012 that enhanced the transmissibility of H5N1 viruses, the US Department of Health and Human Services created a policy for screening dual-use research proposals. These proposals carry special requirements, including that researchers submit manuscripts for review prior to publication.

We debated whether censorship of publication was the appropriate measure for dual-use research. Some wondered how the inability to publish findings would affect researchers’ careers. Ideas were proposed that regulations on dual-use research be set well in advance of publication to avoid a huge waste of time and resources; for instance, scientists wishing to work on research deemed too dangerous to publish should be given the chance to consent to censorship before being funded for the study.

In addition to concerns about bioterrorism, public warnings have been issued over accidental escape of pathogens from research labs, fueled by incidents this past summer involving smallpox, anthrax, and influenza. Some caution that the risks of laboratory escape outweigh the benefits gained from the research. Scientists Marc Lipsitch and Alison Galvani calculate that ten US labs working with dangerous pathogens for ten years would run a 20% chance of a laboratory-acquired infection, a situation that could possibly spark an outbreak. On July 14, a group of academics called the Cambridge Working Group released a consensus statement that “experiments involving the creation of potential pandemic pathogens should be curtailed until there has been a quantitative, objective and credible assessment of the risks, potential benefits, and opportunities for risk mitigation, as well as comparison against safer experimental approaches.”
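Headline figures like the 20% estimate come from cumulative-probability arithmetic over independent lab-years. A minimal sketch of that calculation follows; the per-lab-year probability is back-calculated here to match the headline figure, not taken from the Lipsitch and Galvani paper:

```python
# Sketch of the cumulative-risk arithmetic behind estimates like "ten labs
# over ten years run a ~20% chance of a laboratory-acquired infection."
# The per-lab-year probability is illustrative, chosen to reproduce ~20%.

def cumulative_risk(p_per_lab_year, labs, years):
    """P(at least one infection) across independent lab-years."""
    return 1 - (1 - p_per_lab_year) ** (labs * years)

# Even a tiny per-lab-year probability compounds across 100 lab-years:
print(round(cumulative_risk(0.00223, labs=10, years=10), 2))  # ~0.2
```

The point of the sketch is that small per-lab risks add up quickly once many labs run such experiments for many years.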

In defense of the research is the group Scientists for Science, which contends that “biomedical research on potentially dangerous pathogens can be performed safely and is essential for a comprehensive understanding of microbial disease pathogenesis, prevention and treatment.”

Our discussion of this research demonstrated the same divide. Some pointed out that science is inherently unpredictable, so the possible benefits of research are difficult to estimate in advance; by this logic, the best way to learn about highly pathogenic viruses is to study them directly. Another person relayed the argument that studying viruses in ferrets (as was done in the controversial influenza experiments of 2012) is safe because those virus strains don’t infect humans. However, Nicholas Evans, a Penn bioethicist and member of the Cambridge Working Group, argued in a recent paper that claiming ferret research is safer because the virus doesn’t infect humans also implies that it has limited scientific merit, because it is not relevant to human disease.

There seems to be agreement that research on potentially dangerous and dual-use agents should be looked at more closely than it has been. The debate really centers on how much oversight and what restrictions are placed on research and publications. With both Scientists for Science and the Cambridge Working Group accruing signatures on their statements, it is clear that the middle ground has yet to be found.


PSPG hits the streets to explain how genes make us who we are

           With help from the University of Pennsylvania and GAPSA, PSPG ran a volunteer booth at the Philly Science Carnival on May 3rd. The carnival was part of the annual nine-day Philly Science Festival, which provides informal science education experiences throughout Philadelphia’s many neighborhoods. The PSPG exhibit, titled “Who owns your genes?”, featured activities for children and adults alike, educating visitors about how genes make us who we are, what we can and cannot learn from personalized genomics services like 23andMe, and how several biotech companies have attempted to patent specific genes.
           Kids learned how genes act as the instructions for building an organism by drawing alleles for different traits out of a hat and using the genotype to decide how to put together a “monster.” In so doing, they were exposed to the basic principles of genetics (dominant vs. recessive alleles, complete vs. incomplete dominance, and codominance), and they got to leave with a cute pipe-cleaner monster too.
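The complete-dominance case from the activity can be sketched in a few lines of code; the trait ("horns") and the allele symbols are invented here for illustration:

```python
# Sketch of the "monster genetics" activity: draw two alleles from a hat and
# use complete dominance to decide the phenotype. The trait and allele
# symbols are made up for illustration.
import random

def phenotype(allele1, allele2):
    """Complete dominance: one copy of dominant 'A' is enough for horns."""
    if "A" in (allele1, allele2):
        return "horns"
    return "no horns"  # recessive phenotype requires two 'a' alleles

hat = ["A", "a"]  # one allele inherited from each parent
genotype = (random.choice(hat), random.choice(hat))
print(genotype, "->", phenotype(*genotype))
```

Incomplete dominance or codominance would simply change the rule inside `phenotype`, e.g. returning a blended or combined trait for the heterozygote.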
           For our older visitors we presented actual results from a 23andMe single nucleotide polymorphism (SNP) report, generously provided by one of our own members (advocacy coordinator Mike Convente). This part of the exhibit walked visitors through the process of SNP genotyping (SNPs are single-base positions in the genome that vary between individuals and can give hints about ancestry, physical traits, and possibly disease risk) and the implications of bringing these types of tests to the general public. Right now the Food and Drug Administration is trying to figure out how to regulate services like these, which provide genetic information directly to consumers without a qualified intermediary (such as a doctor or geneticist) to explain the complicated results.
           Our exhibit also featured a section entitled “How Myriad Genetics Almost Owned Your Genes,” which highlighted the recent Supreme Court case brought against a biotech company that wished to patent two genes (BRCA1 and BRCA2) involved in the development of breast cancer. The genes were discovered at the University of Utah in a lab run by Mark Skolnick, who subsequently founded Myriad Genetics. Myriad went on to develop a high-throughput sequencing assay to test patients for breast cancer susceptibility and eventually obtained patents for both genes. This was controversial for several reasons: 1. These genes exist in nature in every human being and are not an invention; 2. The genes were originally discovered with public funding; and 3. Myriad had a monopoly on testing for BRCA mutations and prevented universities and hospitals from offering the tests. Last year, in Association for Molecular Pathology v. Myriad Genetics, several medical associations, doctors, and patients sued Myriad to challenge the patents, and the Supreme Court ruled that naturally occurring genes are not eligible for patent protection (though synthetically made complementary DNA remains patent-eligible). The patenting of DNA sequences will likely remain an issue in the future, given recent advances in the field of synthetic biology.

Genetically-modified food is not going to give you cancer



Last week PSPG and the Penn Biotech Group hosted Dr. Val Giddings, President and CEO of the consulting firm PrometheusAB and Senior Science Policy Fellow at the Information Technology and Innovation Foundation. Dr. Giddings specializes in issues concerning genetically modified organisms (GMOs) or, as he prefers to call them, “biotech-improved” organisms, which have been genetically engineered to have certain beneficial traits. This usually means that a gene from one organism is inserted into the genome of a different organism to alter its properties or behavior in some beneficial way. GMO crops are frequently altered to improve tolerance to herbicides (think RoundUp) and resistance to insects and pathogens. They can also be modified to change their agronomic qualities (how and when they grow), which helps farmers be more productive, or to improve their quality: for example, Golden Rice has been engineered to produce beta-carotene, the precursor to vitamin A, an essential nutrient that many children in developing countries don’t get enough of1,2. GMO crops are quite prevalent in US agriculture, with over 90% of soybeans, 80% of cotton, and 75% of corn grown in the US being genetically modified in some way3. Outside of the US, GMO crops are grown in 27 countries by 18 million farmers, most of whom are smallholders in developing countries4. So what are the consequences of all these genetic modifications in our food supply?


The Pros: GMO crops with improved agronomic properties have allowed farmers to increase yield on less land, reducing CO2 output and allowing more wild habitats to remain untouched. GMOs have also decreased the need for pesticides because insect-resistant plants fight off pests on their own, which is good for the environment and good for you. GMO crops that have been modified to increase yield and produce essential nutrients could be a boon for developing countries where hunger and vitamin deficiencies are a serious problem.

The Cons: GMOs could lead to the overuse of herbicides like RoundUp, because herbicide-tolerant plants can be sprayed more often with more chemicals. However, Dr. Giddings argued that herbicide-tolerant crops might actually be treated less often, because the weeds can be killed off quickly in one fell swoop, so there may be a trade-off there. I think the most serious concerns about GMO crops relate to unintended ecological consequences. GMO crops, if they somehow escaped the farm and started growing wild, might out-compete other plants and reduce overall biodiversity. They could also seriously disrupt the food chain, especially considering that they can kill off insect species that are undoubtedly a food source for other animals. And then there’s the question of whether GMOs are safe to eat. There are concerns that GMO crops that produce foreign proteins (such as those that kill off insects) might trigger allergic reactions in some individuals; however, there have never been any legitimate reports of this happening. There are also concerns that GM foods could cause cancer, but rigorous, peer-reviewed scientific studies have effectively ruled out this scenario. In fact, the most prominent study to claim a link between GMOs and cancer was retracted because the sample size was too small to draw any conclusions and the authors used a rat strain known to have a high background incidence of cancer5. The bottom line is that there is no evidence that GMOs are bad for you, and the Food and Drug Administration has concluded that GMOs are safe to eat6.

Existing federal law requires food labels to be accurate, informative and not misleading. Nutrition labels must contain material information relating to health, safety and nutrition. The fact of the matter is that GMOs are considered safe so there is no reason for the FDA to force companies to identify their products as GMO. Basically the FDA decided that genetically modified foods are subject to the same labeling rules as any other food. Here are the highlights from FDA’s recommendations on how to label GM food7:


  •  If a bioengineered food is significantly different from its traditional counterpart such that the common or usual name no longer adequately describes the new food, the name must be changed to describe the difference.

  • If a bioengineered food has a significantly different nutritional property, its label must reflect the difference.

  •  If a new food includes an allergen that consumers would not expect to be present based on the name of the food, the presence of that allergen must be disclosed on the label.


However this doesn’t mean consumers are completely in the dark about what they’re buying. If you wish to avoid GM foods you can buy food labeled “USDA Organic” or “Non-GMO certified.” Otherwise it’s probably safe to assume a product includes some kind of bioengineered ingredient.
So why the big fuss over GMOs? It’s pretty clear that bioengineered foods are safe to eat and that GMOs are probably more helpful than hurtful to the environment. Dr. Giddings offered a few explanations for the widespread resistance to GMOs in the Western world. Firstly, the organic/health food industry reaps big profits by distinguishing itself as a healthy, safe alternative to Big Ag. Secondly, and more importantly, food is a huge part of every human being’s life, and nobody likes the idea of it being messed with in ways they might not understand. It’s especially disconcerting when huge tentacle-y corporations are responsible for these changes. So with all these considerations in mind, it’s up to you, the consumer, to decide whether genetically modified foods are worth the risks. Feel free to comment if there are any issues I omitted that you think are worth noting.

-Nicole Aiello 


1. Ye X, Al-Babili S, Klöti A, Zhang J, Lucca P, Beyer P, Potrykus I (2000) Engineering the provitamin A (β-carotene) biosynthetic pathway into (carotenoid-free) rice endosperm. Science 287:303-305.
2. Grune T, Lietz G, Palou A, Ross AC, Stahl W, Tang G, Thurnham D, Yin S, Biesalski HK (2010) β-Carotene is an important vitamin A source for humans. Journal of Nutrition doi: 10.3945/jn.109.119024.
3. US Department of Agriculture (USDA). Economic Research Service (ERS) 2013. Adoption of Genetically Engineered Crops in the US data product.
4. Clive James, ISAAA Brief 46.
5. Séralini, Gilles-Eric; Clair, Emilie; Mesnage, Robin; Gress, Steeve; Defarge, Nicolas; Malatesta, Manuela; Hennequin, Didier; De Vendômois, Joël Spiroux (2012). "Long term toxicity of a Roundup herbicide and a Roundup-tolerant genetically modified maize". Food and Chemical Toxicology 50 (11): 4221–31.

The unintended impact of impact factors



Dr. Mickey Marks of UPenn stopped by PSPG yesterday to discuss the San Francisco Declaration on Research Assessment (DORA), which calls for new metrics to determine the value of scientific contributions. The system in question is Thomson Reuters’ Impact Factor (IF), which was developed in the 1970s to help libraries decide which journals to curate. Since then, IF has taken on an inflated level of importance and can even influence promotion and hiring decisions. But can a single number really summarize the value of a scientific publication?
IF is calculated by dividing the number of citations a journal’s articles received in a given year by the number of citable articles the journal published over the previous two years. One reason Dr. Marks became involved in DORA is that he is co-editor of a journal whose IF had been steadily dropping over the last few years, a trend experienced by numerous other cell biology journals. This led many in the field to question whether IF is really accurate and useful. As you might imagine, many factors can skew IF one way or another: for example, in some fields papers are slower to catch on and might not start accumulating citations until well past the two-year window over which IF is calculated. Journal editors can also game the system by reducing the number of “citable” articles they publish: citable articles must be a certain length, so a journal that publishes many short articles can shrink the denominator and inflate its IF. So how reliable is the IF system? Are journals with a high IF really presenting the best science? A few years ago the editors of one journal (Infection and Immunity) set out to address that very question, and the answer may (or may not) surprise you: they found a strong correlation between IF and retractions (see graph).
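The calculation, and the denominator trick editors can play, reduce to a single ratio. A minimal sketch, with all figures invented for illustration:

```python
# Sketch of the two-year impact factor calculation described above.
# All figures are invented for illustration.

def impact_factor(citations, citable_items):
    """IF for year Y: citations received in Y to articles from Y-1 and Y-2,
    divided by the number of citable items published in Y-1 and Y-2."""
    return citations / citable_items

# A journal published 120 + 130 citable articles over the prior two years,
# which drew 750 citations this year:
print(impact_factor(750, 120 + 130))  # 3.0

# "Gaming" the metric: reclassifying 50 articles as non-citable shrinks the
# denominator and inflates the IF with no change in citations.
print(impact_factor(750, 200))  # 3.75
```

The second call shows why the definition of “citable” matters: the same citation count yields a higher IF once the denominator shrinks.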

Infect. Immun. October 2011 vol. 79 no. 10 3855-3859
          Why are these high-impact journals forced to retract at such higher rates? It might be because their editors are looking for sexy science (because that’s what sells) and may be willing to overlook sloppy research conduct to print an exciting story. Another reason may be that researchers are under extreme pressure to publish in these journals and are willing to omit inconsistent data and let mistakes, and even misconduct, slip through to keep the story neat. And this brings us to the real problem presented by IF: in some circles, individual researchers’ scientific contributions are judged almost entirely by which journals they publish in. Scientists learn very early in their careers that if you want a faculty job you need to publish in Science, Nature, and Cell, because it is faster and easier to draw conclusions from a journal’s reputation than to actually read an applicant’s publications.
There are a few alternatives to IF, including Eigenfactor and SCImago Journal Rank, which weight each citation by the influence of the journal it comes from, using iterative algorithms similar to the PageRank algorithm Google developed to rank web pages. These alternatives generally produce rankings similar to IF, however. The real issue isn't the rankings themselves but how we as scientists use them. If the system is going to change, it will have to start with us. Scientists must decide together to de-emphasize impact factors and publication rankings when making decisions about promotions, hiring and grants.
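To make the idea behind these PageRank-style metrics concrete, here is a minimal power-iteration sketch on a toy three-journal citation network; the journal names and citation counts are made up for illustration, and real metrics such as Eigenfactor add refinements like excluding self-citations and normalizing by article counts:

```python
# cites[a][b] = number of citations from journal a to journal b
cites = {
    "A": {"B": 30, "C": 10},
    "B": {"A": 20, "C": 20},
    "C": {"A": 5,  "B": 5},
}
journals = sorted(cites)
damping = 0.85  # standard PageRank damping factor
rank = {j: 1.0 / len(journals) for j in journals}

for _ in range(100):  # power iteration until approximate convergence
    new = {j: (1 - damping) / len(journals) for j in journals}
    for src, targets in cites.items():
        total = sum(targets.values())
        for dst, n in targets.items():
            # Influence flows along citations, weighted by the citing
            # journal's own current rank: a citation from a highly
            # ranked journal is worth more than one from a minor one.
            new[dst] += damping * rank[src] * n / total
    rank = new

print({j: round(r, 3) for j, r in rank.items()})
```

The key design difference from raw IF is that a citation's value depends on who is citing, not just how many citations there are.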

Nicole Aiello
PSPG Communications

Bioethics/Policy Discussion: 23andMe vs. FDA

On Wednesday, members of the Penn Science Policy Group met to discuss the current regulatory friction between the FDA and the genetic testing company 23andMe.

23andMe provides personalized DNA results that are interpreted to provide information about ancestry and health risks for various diseases. Because the results provided by 23andMe border on medical information, the FDA had been working closely with the company since 2009 to ensure that its marketing and analysis were accurate and in accordance with federal regulations. However, in May of 2013 23andMe ceased communications with the FDA while simultaneously ramping up marketing of its Personal Genome Service (PGS) as providing "health reports on 254 diseases and conditions." In response, the FDA sent a letter to 23andMe on Nov 22 warning the company to stop marketing the PGS without approval or face harsh regulatory actions. This letter sparked a public debate about how much regulation should be imposed on this new technology, which was the focus of our discussion on Wednesday.

The question at the heart of the debate is: do individuals have the right to access their own genetic information (and interpretations of it) without medical (and hence FDA) oversight?



The laissez-faire argument is that individuals have the right to know their own genetics, and that governmental paternalism should not restrict access over fears that certain individuals might react badly to their DNA interpretations. For instance, a woman might find out she is slightly predisposed to breast cancer and seek an unnecessary radical mastectomy (though it would likely be hard to find a doctor willing to perform the procedure based solely on a 23andMe PGS result). Such scenarios shouldn't prevent more responsible people from obtaining their information. To their credit, 23andMe is very open about allowing people to access more information regarding their personalized interpretations. Curious users can find detailed explanations and links to scientific studies if they care to investigate.

Those in favor of regulation argue that the accuracy of the PGS needs to be verified. They caution that faulty tests could cause unnecessary worry and lead to increased healthcare expenditure to verify or refute the results. The FDA states that most of the intended uses for the PGS are consistent with uses that regularly require FDA approval. The rationale in favor of regulation is that a company marketing its genetic test as health information should be required to demonstrate that the information is correct and reliable. This raises further debate about what counts as correct and reliable, and who should determine it. 23andMe insists that the FDA needs to set clear guidelines on this technology before it is feasible for the company to comply.

That, in essence, was the purpose of the ongoing communication between the FDA and 23andMe. To many people, it seemed strange that 23andMe abruptly ceased talks with the FDA. Some have hypothesized that it was a marketing ploy, one that seems to have worked thanks to extensive media coverage. While it is too soon to determine the motive for severing communication, 23andMe has since resumed talks with the FDA and halted its health-related genetic reports (ancestry is still available). CEO Anne Wojcicki commented that their "goal is to work cooperatively with the FDA... to make sure consumers have direct access to health information in the near future."

It will be interesting to watch this process unfold. Personal genetic tests will only become more popular as our knowledge of human genetic variation and disease grows, so the regulation of 23andMe will be crucial in setting the standard for other companies offering this service.

What do you think about the FDA regulating 23andMe? Comments are welcome.



Richard Calderone and Science Advocacy at Georgetown

Dr. Richard Calderone stopped by PSPG last week to talk about the master's program in Science Policy and Advocacy at Georgetown University. Dr. Calderone is a microbiologist with an active laboratory, but he also advises lawmakers on public health issues, especially those involving infectious diseases. A few years ago he started the science policy master's program, which was modeled after an undergraduate certificate program in policy that already existed at Georgetown. Students in this interdisciplinary master's program take courses not only on government and policy but also science classes such as microbiology, immunology and pharmacology. About 40% of graduates are currently in policy positions at places like the EPA and Research!America, while many others go on to professional school (medical, dental, law). So if you're thinking about a career in science policy, you might want to check out the program!

Dr. Calderone (from Georgetown University Faculty website)

Recap: Adam Katz of Research!America - How (and why) to engage Congress as a research scientist

Adam Katz is the Policy and Advocacy Specialist at Research!America, where he leads a variety of advocacy initiatives to make science and medical research a higher national priority. When he visited PSPG on Nov. 12th, he spoke about how and why scientists, especially those in academia, should engage in the political system. Academic scientists in particular are heavily supported by federal funding and taxpayers, so it is important to initiate and maintain a dialogue between researchers, politicians and the American people. Research!America has conducted numerous polls to understand the relationship between these three groups, and they have found that while Americans believe scientific research should be a top priority, the public does not have a clear understanding of how this research is funded (only a small fraction of those polled identified the NIH as the main source of basic biomedical research funding). Therefore it is important that we as scientists, taxpayers and constituents take it upon ourselves to stress to our politicians the importance of the NIH and its integral role in supporting basic science research.

There are many ways to make your voice heard: through email, phone calls and even in person, as PSPG has done twice this year. Mr. Katz gave excellent tips on how to address a congressperson and their staff: thank them for past support, keep it concise, lay out your concerns and follow with a personal story, ask their opinion, and reiterate the action you'd like taken. If this type of science policy work sounds like a potential career path for you, we encourage you to attend our next speaker event on Wednesday, Dec. 4th at 12pm, featuring Dr. Richard Calderone, director of the M.S. program in Biomedical Science Policy & Advocacy at Georgetown University.


An informal discussion with Dr. Paul Offit: recap

Dr. Paul Offit stopped by PSPG to lead a discussion on snake oil and pseudoscience and how scientists can combat misinformation. An excellent example of this issue is the ubiquitous use of dietary supplements to treat everything from colds to weight loss to depression. Dietary supplements do not require FDA approval before they are marketed and are not rigorously tested for safety and efficacy, yet millions of Americans believe that multivitamins and supplements keep them healthy. These daily supplement capsules are packed with more vitamins than a person could possibly get from a normal diet, and yet there is no evidence that more is better. Because supplements are so loosely regulated, what you see on the bottle is not necessarily what you get: there have been instances in which a supplement was actually 30x more concentrated than the label claimed, and there have been cases of contaminated supplements causing death.

So why is the general public so easily fooled? Because people believe they're ingesting "natural" alternative homeopathic remedies, rather than drugs that could have side effects, when in reality these supplements are drugs themselves, often made by the very pharmaceutical companies people think they're sticking it to. How can we as scientists combat this false narrative? Dr. Offit suggests first using evidence-based science, and if all else fails, appealing to emotion. In an ideal world, facts, evidence and reason would be enough to convince people, but unfortunately that is not the world we live in.

Dr. Offit is the author of numerous books, including Deadly Choices: How the Anti-Vaccine Movement Threatens Us All and, most recently, Do You Believe in Magic?: The Sense and Nonsense of Alternative Medicine.

Check out Dr. Offit on the Colbert Report in 2011!

Invited Speaker Dr. Harvey Rubin on "A Proposal for a Global Governance System for Infectious Diseases”

Michael Allegrezza

It's easy to forget about infectious disease when one has access to quality healthcare that includes vaccines and antibiotics for most major pathogens. But infections still account for 22% of all deaths worldwide, and in the developing world it is far worse: over half of all deaths in sub-Saharan Africa are from infectious diseases. While many researchers are using science and technology to combat this problem, others have recognized that creating international policies for monitoring and controlling infectious disease would also greatly decrease mortality and minimize global outbreaks.

In June, Dr. Harvey Rubin gave a talk entitled "A Proposal for a Global Governance System for Infectious Diseases" to members of the Penn Science Policy Group. In addition to running a research lab, Dr. Rubin has established himself as a leading voice on the topic of global disease and serves as the Director of Penn's Institute for Strategic Threat Analysis and Response. His publications and testimony before various government bodies lend credibility to his opinions, but it is the real-world application of his insight into vaccine distribution that gives him the most authority.

Vaccines are an enormously effective way to prevent infectious disease. Unfortunately, access to vaccines is limited in certain areas of the globe, specifically places without a reliable power grid, because vaccines need to be kept refrigerated. Dr. Rubin's solution: use the excess energy generated by cell-phone towers, which cover nearly the entire globe (as he puts it, "there are more cell phones in the world than there are toilet bowls"), to power vaccine refrigerators. This simple solution could save 5 million lives annually. After working with local governments, non-profits, and commercial organizations, Dr. Rubin and his team have established a pilot program in Zimbabwe and hope to begin another soon in India. You can check it out here: http://www.energizethechain.org/

Dr. Rubin explained that there is also a challenge ahead in monitoring and controlling global outbreaks of disease. Even after the recent scares of influenza and SARS outbreaks, there are still no international treaties related to infectious disease. Problems such as the rise of antibiotic-resistant pathogens, unsafe and poorly secured containment facilities, political instability, and minimal drug-development incentives are creating a climate ripe for global disease outbreaks. But these problems present opportunity: each has policy-oriented solutions that could minimize the risk of such an outbreak. Toward that end, in 2009 Dr. Rubin published a proposal in Current Science detailing four interconnected components where policy could help shape global disease monitoring and control, which he discussed with us during his presentation.

To wrap up the discussion, Dr. Rubin shared sage advice for those interested in making a career out of this work: "Don't do what I did." The demands of academic research can quell this type of branching out, at least until tenure is obtained. He noted that good places to develop a career in international health policy can be found outside academia, in government and NGOs. Most importantly, though: "be an expert in something; then people will believe that you know what you are talking about."