Breaking bottlenecks in drug discovery and development

Scientists reveal early-stage strategies for bringing safer, more effective drugs to market faster

Cambridge Healthtech Institute hosted a complimentary webinar for Elsevier on September 15 in which industry experts in safety pharmacology and toxicology shared current thinking on drug-development bottlenecks and translational challenges, and innovative strategies that are being investigated to help overcome them. The opinions in this article are those of the individual speakers and not of the organizations.

Despite years of effort and advances in technology and data mining, bringing a drug from bench to bedside remains a costly, complex and time-consuming process with no guarantee of success. According to a recent report by Pharmaceutical Research and Manufacturers of America, “Biopharmaceutical Research & Development: The Process behind new Medicines,” it takes on average at least 10 years for a drug to make the journey from discovery to the marketplace at an average cost of $2.6 billion. And the likelihood that a drug entering clinical testing will eventually be approved is estimated to be less than 12 percent.

Researchers have begun testing ways to overcome bottlenecks in drug discovery and development, thereby streamlining the process involved in selecting a candidate drug, developing it, getting it into the clinic and then to the public. They’re becoming more proactive about identifying and investigating potential adverse effects before the clinical trial phase; using stem cells and in silico (computer) modeling to gain insights earlier into how a promising compound might react in the human body; and changing how they select and work with animal models, thereby reducing the amount of animal testing required for a new drug approval.

In a recent symposium for pharmaceutical industry thought leaders, “Preclinical Safety Strategies to Impact Early Decision-Making,” experts in pharmacokinetics/pharmacodynamics, safety pharmacology and toxicology explained how these approaches are enabling drug developers to predict sooner and with more accuracy whether a candidate drug has a chance of success.

Identifying unwanted drug side effects sooner

A valuable but often overlooked strategy is the implementation of secondary pharmacology in the earliest stages of drug discovery, according to Dr. Laszlo Urban, Global Head of Preclinical Secondary Pharmacology at Novartis Institutes for Biomedical Research in Cambridge, UK. Once researchers determine by laboratory testing that a particular compound or candidate drug effectively acts on a therapeutic target (e.g., an enzyme or receptor implicated in a disease), they move on to secondary pharmacology — or not, Dr. Urban noted. Secondary pharmacology studies evaluate potential “off-target” or unintentional effects of a promising compound. But to speed up the discovery/development process, researchers often skip that step, moving right into safety pharmacology studies — mainly animal testing — instead.

But if off-target effects are picked up earlier, chemistry and data mining techniques often can be used to modify the compound, removing the off-target effects while maintaining the therapeutic effects, Dr. Urban explained. “If we don’t do that, we have to face managing those off-target effects in a later phase, at which point they may translate into clinical adverse reactions.”

For example, it is now known that 5-HT2B (serotonin) receptor agonists such as the weight-loss drug fenfluramine-phentermine and the recreational drug MDMA (also known as ecstasy) can trigger potentially irreversible cardiac valve disease (fibrosis). That problem led to a scandal in France surrounding the diabetes drug Mediator. The drug was widely prescribed off-label as an appetite suppressant to women who wanted to lose weight. Its use has since been linked to thousands of deaths and injuries.

When researchers investigate potential therapeutic compounds, the implementation of secondary pharmacology could pick up on such off-target hazards, Dr. Urban said: “We could determine at that point whether fibrosis manifests in the heart valve when the drug hits the target. If we go to the clinic without doing this, we end up with something like Mediator. If we deal with an off-target effect in the early preclinical stage, we’re in a much better position to develop a compound that doesn’t have this side effect.”

As an example of how mitigation worked during the early drug discovery phase, Dr. Urban pointed to a project in which researchers identified several promising therapeutic compounds, all of which had off-target effects on an enzyme called PDE3. “We watched this effect closely because we know that PDE3 inhibition will increase mortality in congestive heart failure patients,” he explained. Secondary pharmacology revealed that the effect was linked to the concentration of the compound in the blood. “Work was terminated on two of the clinical candidates before they went to the clinic because they had very narrow therapeutic indices and so were more likely to trigger problems. We selected the one with a wide therapeutic index for further development.”

Despite secondary pharmacology’s successes in early safety assessments and its ability to detect “invisible” adverse reactions that could develop only during prolonged treatment, the approach isn’t foolproof. “We have to be aware that a disease mechanism may influence the adverse reaction profile, so a drug that is safe in one patient population may end up not being safe in another,” Dr. Urban cautioned. “And we have to be careful not to overemphasize a potential adverse reaction, which could stop a project in its tracks. However, underestimating a risk is equally dangerous.”

Predicting human safety on a desktop

Safety pharmacology has traditionally emphasized in vivo testing; however, in an effort to reduce costs and shorten drug discovery timelines, there is an increasing need to develop predictive in vitro and in silico models, according to Dr. Bernard Fermini, Manager, Ion Channel Group, Global Safety Pharmacology, Pfizer, in Groton, Connecticut. “Because clinical attrition remains a concern, despite the current in vivo de-risking strategies, some have called into question the paradigm we’re using to evaluate safety,” he said. “There is also increasing pressure, in the US but especially in Europe, to work toward the European Union’s directive of the ‘three Rs’ — replace, reduce and refine the use of animals for scientific purposes.” Taken together, “there’s a demand for more predictive, less expensive and faster tools, which is pushing us back into the in vitro domain.”

Advances in stem cell technologies are helping to propel in vitro research forward, and “depending on how things go in the next few years, they could be important game changers,” Dr. Fermini said. Working with primary cardiomyocytes — heart cells obtained from human tissue — has drawbacks. Among other factors, human cardiac tissue is not readily available, and cells obtained from this tissue don’t keep well in culture. Following shifts in the portfolios of many pharmaceutical companies, an increasing number of in vitro safety tests require chronic exposure to a drug to determine whether it has adverse effects on the heart. Also, “there are no human cell lines available that faithfully reproduce the electrophysiological properties of human cardiac cells,” he noted, so researchers can’t be sure they’re seeing the full safety picture.

By contrast, stem cells are commercially available, express most cardiac ion channels and can be kept in culture for a long time, making them suitable for chronic testing, Dr. Fermini said. Stem cells also can be generated from patients with specific diseases, so they can be used to study heart conditions, such as long QT syndromes and heart failure.

That said, stem cells also have shortcomings. “Not all stem cells are created equal. Different companies provide these cells, and they are not generated, produced, maintained and differentiated in the same way,” Dr. Fermini observed. In addition, certain characteristics of stem cells — for example, a lack of maturity in their ability to cycle calcium — make them unsuitable for certain projects. Vendors and users are discussing ways to overcome these issues, he said, “and once they’re solved, we’ll be in possession of human cardiac cells that faithfully reproduce the properties of human heart cells and can be cultured, made into fibers and used in 2D and 3D culture. We’ll have in vitro material that will be easy to use and that will translate nicely to clinical outcomes. In addition, the whole field of organs-on-chips is moving very quickly, which will give us more opportunities to develop relevant models while reducing our usage of animals for testing.” These options should be available “within the next decade or sooner,” he predicted.

New techniques in in silico modeling also are reducing the need for animal testing and enabling researchers to predict cardiovascular safety “on a desktop, and very cheaply,” Dr. Fermini said. “The advent of automated systems and the ability to generate and better understand large amounts of data are allowing us to create more accurate models.”

The ideal would be to use stem cells together with in silico modeling to generate the most robust models, an approach in keeping with the tenets of the CiPA (Comprehensive in Vitro Proarrhythmia Assay) initiative, in which Dr. Fermini serves as co-chair of the Ion Channel Working Group and as a steering committee member. “If we can get all these tools to work together, it will enable us to accelerate the development of new and safe drugs to fill unmet medical needs and make sure they can reach patients more quickly,” he concluded.

Improving animal studies for science

Dr. Noël Dybdal, Director and Principal Scientist-Pathologist at Genentech in San Francisco, discussed advances in animal studies driven largely by the three Rs. “Refining” the way animals are managed during experiments can translate into greater sensitivity and specificity in cardiac test results, she said. For example, the heart rates of restrained animals that were tested using ECGs were much higher than those of unrestrained animals whose heart rates were recorded using telemetry. In addition to being more humane, lack of restraints and use of telemetry led to significantly reduced variability in heart rate data, QT interval data and blood pressure in response to candidate drugs, she said. Similarly, appropriately housing animals in ways that were more in keeping with the animals’ preferences in nature — group housing versus individual housing, for example — reduced stress on the animals and yielded “very robust and highly sensitive data.”

“We need to remember we’re not working with robots or computers, we have living, breathing, physiological systems that have behavioral responses to their experimental paradigms,” Dr. Dybdal said. “If we think about that and choose the right model, and design the best protocol for the specific molecule we’re working with, we will get the lowest variability in our data and increase the sensitivity of our output.”

Dr. Dybdal also pointed to microsampling — taking a number of small blood samples from the main study animals instead of using additional animals to measure adverse reactions — as a way of both reducing the number of animals needed and potentially gaining more accurate data, since the same animals could be tested over time.

Dr. Dybdal concluded by emphasizing the concept of “one health,” which recognizes that the health of humans is connected to the health of animals and the environment. In this context, “as our in vivo tools become more refined, we have the opportunity to do drug development for companion animals, as well, eventually helping not only human medicine but also veterinary medicine.”


Dr. Philip MacLaughlin, Director of Product Development at Elsevier R&D Solutions for Pharma & Life Sciences, closed the webinar by stating that the company is “looking at how best to support pharmaceutical R&D” in light of technological advances, massive amounts of data, and efforts to streamline early-stage safety strategies as well as drug development overall. The company has embarked on a project that will involve “providing new and different content, with deeper context and experimental design, solving specific use cases and focusing on areas such as biomarker selection and more rapidly drawing correlations between in vitro and animal studies and their associated human correlates.” Results of the project are expected in late 2015/early 2016. To learn more, contact Dr. MacLaughlin.

Elsevier Connect Contributor

Marilynn Larkin (@MarilynnL) is an award-winning science writer and editor who develops content for medical, scientific and consumer audiences. She was a contributing editor to The Lancet and its affiliated medical journals for more than 10 years and a regular contributor to the New York Academy of Sciences' publications and Reuters Health's professional newsfeed. She also launched and served as editor of Caring for the Ages, an official publication of the American Medical Directors Association. Larkin's articles also have appeared in Consumer Reports, Vogue, Woman's Day and many other consumer publications, and she is the author of five consumer health books.

As a consultant on postural awareness and confidence building, Larkin has presented to corporations and nonprofits and at regional and national meetings of, among others, the American Society on Aging and National Council on Aging, the American College of Sports Medicine, and New Jersey Dietetic Association.
