Mitigate The Potential For Immunogenicity To De-Risk Your Early Drug Development

Source: Lonza

By Noel Smith Ph.D., Head of Immunology, Lonza


Despite exciting breakthroughs in science and technology over the last couple of decades, the pharmaceutical industry continues to struggle with high drug-attrition rates: an estimated 1 in 1,000 preclinical drug candidates actually reaches the market.1 Facing an average of 12.8 years and a cost of approximately $2.6 billion to complete the drug development journey, manufacturers must adopt risk mitigation strategies that can move their ideas beyond pharma’s “valley of death” — not only to reduce investments in time and resources but also to deliver innovative new therapies to the patients who need them.2

While the reasons for clinical failure vary, a large portion stem from issues with safety and efficacy.3 A lack of translatable testing models has historically limited preclinical insight into how the human body will react to a drug candidate, but new assays that can more accurately predict immunogenicity potential are now available. Their success, however, depends on several factors, including cell quality, culture conditions, and assay readout parameters. Understanding what these factors are, as well as how and when these tools should be used, will help you design the most effective testing strategy for your drug development program, ultimately increasing its likelihood of success and improving speed to market.

