NEW MOLECULAR TECHNOLOGIES WILL YIELD TARGETED CANCER THERAPIES AND FUEL ADVANCES IN PERSONALIZED MEDICINE:

An Interview with Karol Sikora of Hammersmith Hospital and AstraZeneca

Cancer presents a wealth of opportunity for genomics-based drug development strategies. Genomics is beginning to provide important new targets for "cleaner," less toxic therapies. It is also providing clues about what makes certain cancers resistant. In the future, genomics should provide new means of early detection as well, which will further improve outcomes. In this article, Karol Sikora, Professor of Cancer Medicine, Imperial College, Hammersmith Hospital, and Senior Consultant to AstraZeneca Oncology and Medical Solutions PLC, discusses the challenges of learning how to optimize these new, targeted cancer therapies in the clinic. This commentary is excerpted from a new Cambridge Healthtech report, Cancer Genomics: Revolutionizing Treatment and Reshaping Markets through Targeted Therapies.

The pharmaceutical industry is currently experiencing a time of great transition. This transition stems from the fact that molecular biology has produced a whole series of new targets and therefore, a whole series of new drugs.

The difficulty is that we do not really know how to use these new drugs. The biggest problem is not the chemistry and getting new molecules. Nor is it conducting clinical trials, because there is great machinery in place to do that. It's the area in between the development of new drugs and their subsequent application in the clinic. This area has been ignored, both by academic groups and big pharma, for many years. However, it is now receiving a lot of attention.

I believe that it is this critical area that we need to emphasize in cancer, more so than in any other therapeutic area within drug development. Five to ten years in the future, there is going to be a real change in the molecules that will be given to patients. These all currently have NDAs filed, and the FDA will approve them over the next ten years. The difficulty then will be how to optimize their use in the clinic. To do that, we'll need diagnostic capability that healthcare delivery systems around the world do not currently possess. Genomics and proteomics will help bring that to bear. There will also be new advances in technology, just as PCR was a huge advance 10–15 years ago. A lot of emphasis is going to have to be placed on academic groups that are dealing with patients, so that it is possible to obtain fresh samples.

Relatively sophisticated molecular biology must be performed using fresh samples before we can decide how to optimize therapy.

I envision that in ten years' time, there will be lots and lots of different drugs hitting different molecular targets. We'll know a lot about these molecular targets. Today's histopathologist will evolve into a molecular pathologist in cancer, who will be able to tell us which drugs are likely to work in a particular tumor, in a particular patient. We'll break down the barriers between cancers: instead of thinking strictly in terms of breast, lung, colorectal, or prostate cancer, these will become diseases that are more or less likely to respond to agents directed at different classes of molecular targets.

The prospect is daunting for today's healthcare delivery systems. Specialized providers of diagnostic services will need to emerge. It is unlikely that big pharma will do this on its own: if you look at the beginnings of targeted therapeutics, such as Herceptin and other humanized antibodies, the diagnostic is produced by a variety of companies outside the big pharma developing the therapeutic. I suspect that this will continue to be the model in the future.

We are already witnessing the emergence of embryonic molecular pathology companies. These will likely become the diagnostic companies of the future in cancer. First and foremost, therapies will be based on diagnostics. The other thing that will come of this is that new targets and new drugs will be discovered using these same diagnostic technologies. There will be a complete change in the way we think about cancer.

Suppose 100 patients come into the clinic with breast cancer. Each of those 100 patients will require a different regimen. At the moment, treatment is tailored very crudely. For example, a relatively young, pre-menopausal patient with poorly differentiated disease and lymph node involvement might get some type of adjuvant chemotherapy, whereas an elderly patient with limited disease and no nodal involvement would get tamoxifen only. So the mindset of targeted therapy is already there: segregation of patients is already occurring, but in a very crude fashion. As a result, some people are getting too much chemotherapy, some people are getting too little chemotherapy, and some people are probably getting the wrong drugs altogether. In the future, however, all the new therapies will be hitting known targets. Once we understand the whole system, we should be able to devise treatments much more effectively.
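To make the crudeness concrete, the following minimal sketch encodes the kind of rule just described as a simple decision function. The age cut-off, the factors, and the treatment labels are hypothetical simplifications for illustration only, not clinical guidance.

```python
def crude_breast_cancer_regimen(age, premenopausal, poorly_differentiated, nodes_involved):
    """Caricature of today's crude, clinical-factor-based treatment tailoring.

    All thresholds and labels are illustrative only; real decisions rest on
    many more factors and on clinical judgment.
    """
    if premenopausal and poorly_differentiated and nodes_involved:
        return "adjuvant chemotherapy"
    if age >= 70 and not nodes_involved:
        return "tamoxifen only"
    return "clinician's judgment (no simple rule applies)"

# The two patients described above
print(crude_breast_cancer_regimen(42, True, True, True))     # adjuvant chemotherapy
print(crude_breast_cancer_regimen(78, False, False, False))  # tamoxifen only
```

A marker-driven approach would replace these coarse clinical factors with measured molecular parameters, which is exactly the gap the new diagnostics are meant to fill.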

There is tremendous financial pressure associated with going down this route. Iressa, for example, which targets the epidermal growth factor receptor, showed promising results in early studies. But when it was taken into very expensive clinical studies, Iressa was not shown to make a significant difference in the treatment of lung cancer. Clearly, what is happening is that there is a subset of patients who respond to Iressa and a subset who do not. When the big, expensive clinical trials are performed, the responders become lost within the mass of non-responders.
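To see how responders get lost, consider a back-of-the-envelope calculation with invented numbers (these are not Iressa trial data): if only a small subset of patients benefits strongly, the average effect across an unselected trial population shrinks toward nothing.

```python
# Hypothetical illustration of responder dilution in an unselected trial.
# All numbers are invented; they are not from any actual study.
responder_fraction = 0.10           # assume 10% of patients carry the sensitizing feature
benefit_in_responders_months = 8.0  # assumed median survival gain in responders
benefit_in_nonresponders_months = 0.0

# Average benefit seen across an unselected trial population
average_benefit = (responder_fraction * benefit_in_responders_months
                   + (1 - responder_fraction) * benefit_in_nonresponders_months)

print(f"Benefit in responders:        {benefit_in_responders_months:.1f} months")
print(f"Apparent benefit, unselected: {average_benefit:.1f} months")
# An 8-month gain in 1 of 10 patients looks like a 0.8-month gain overall,
# which a trial powered for the whole population may well fail to detect.
```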

Right now, it is partly a lack of good diagnostic tools that is holding us back from being able to effectively target therapy, and it is partly the fact that a lot of the logistics require you to have fresh, frozen biopsies. The only way to use much of the molecular technology is on samples that have been snap-frozen in a controlled way, which is currently not being done. Traditionally, we have been taking biopsy samples and putting them into formalin. The logistics are just not there yet, but they are coming.

Another obstacle is the way in which big pharma companies are organized. They are segmented into a discovery division and a clinical division. Due to consolidation within the industry, many of the clinical divisions are on different continents from the discovery divisions, so there is a lack of cohesiveness between the two. The discovery side is judged by the number of new entities it actually gets into the clinic within a given year; the clinical side is judged by how quickly it can obtain FDA approval for marketing once a drug is handed to it. What is missing is something that allows the input of discovery science into the clinic, or even into the early phase of drug development. Different companies are coming out with different solutions: some call it translational medicine, some call it experimental medicine, some call it clinical science, and some call it molecular medicine. But the fact is, it is still essentially missing.

One way around this is the emergence of companies that provide that sort of service. These companies will specialize in the ethical collection of fresh, frozen material and in the molecular technology to analyze it, producing results that create an interface between the clinic and discovery programs. The commercial side of pharma is concerned about this type of scenario, because it goes against marketing logic. Marketing logic says that you want a blockbuster, and the blockbuster has to have as many uses and be as broadly applicable as possible. What this technology is likely to do is completely segregate the market. That is very worrying from the current commercial standpoint. But on the other hand, if there are 100 compounds and five of those 100 are going to be used in an individual patient, provided one pharma has the whole set of five, it is not quite as worrisome. In that case, the total set becomes the blockbuster, not the individual drugs.

There are certain steps on the way to getting to personalized medicine. The first step is obtaining some sort of biomarker telling you the drug is hitting its target. All research at the moment is on biomarkers to accompany the drugs. So rather than the discovery divisions just handing the drug to the clinical side to go into patients, they will also need to hand over a biomarker that tells you the drug is really hitting the targets that it is designed to hit. Biomarker research could even be done using low doses of the drug in healthy volunteers. For example, a lot of anti-hypertensive drugs are put into normal volunteers first.
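As a rough sketch of what such a target-engagement readout might look like, the snippet below compares a hypothetical pharmacodynamic biomarker measured before and after dosing; the marker, the numbers, and the 50% threshold are all assumptions made for illustration, not any specific assay.

```python
def target_engagement(pre_dose_level, post_dose_level, threshold=0.5):
    """Return the fractional suppression of a hypothetical pharmacodynamic
    biomarker (e.g., a readout downstream of the drug's target) and whether
    it clears an illustrative engagement threshold."""
    if pre_dose_level <= 0:
        raise ValueError("pre-dose level must be positive")
    suppression = 1.0 - (post_dose_level / pre_dose_level)
    return suppression, suppression >= threshold

# Example: measurements before and after a low dose in a volunteer or patient
suppression, engaged = target_engagement(pre_dose_level=100.0, post_dose_level=35.0)
print(f"Target suppression: {suppression:.0%}, engaged: {engaged}")
```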

The second step is that you really need a surrogate endpoint that tells you not only that the drug is hitting its target but also that the tumor is going to shrink, even before it starts shrinking. The importance of the surrogate lies in the fact that many of the new drugs are not actually going to cause dramatic tumor shrinkage; they are just going to maintain the status quo. In other words, they are going to control, rather than cure, the disease. And that is a valid drug if it prolongs survival and confers good quality of life. The trouble is that if an X-ray is taken two months after giving the new drug, not much change will be seen, and the standard way of deciding which drugs to take forward into the Phase II setting is to pick those that induce tumor regression. So if you have a biomarker and a surrogate endpoint that tells you the tumor is likely to respond, then you have the confidence to take the drug into large, randomized studies.
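That argument can be sketched as a simple go/no-go comparison; the 30% regression cut-off and the decision criteria below are invented for illustration and are not actual regulatory or trial rules.

```python
def take_into_phase_ii(tumor_shrinkage, target_engaged, surrogate_predicts_response,
                       use_new_criteria=True):
    """Illustrative go/no-go decision for Phase II.

    Old rule: advance only drugs that induce measurable regression.
    New rule (sketched here): also advance cytostatic drugs when a biomarker
    shows target engagement and a surrogate endpoint predicts benefit.
    The 30% shrinkage cut-off is an arbitrary illustration.
    """
    regression = tumor_shrinkage >= 0.30
    if not use_new_criteria:
        return regression
    return regression or (target_engaged and surrogate_predicts_response)

# A cytostatic drug: no shrinkage at two months, but target hit and surrogate positive
print(take_into_phase_ii(0.05, True, True, use_new_criteria=False))  # False: dropped
print(take_into_phase_ii(0.05, True, True, use_new_criteria=True))   # True: advanced
```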

The other part of development that is needed is improved functional imaging, which is still very much in its infancy. Very sophisticated structural imaging can be done using MRI and CT: one can image tumors and normal tissue and understand all the anatomy. What we are not so good at is understanding what is going on within those structures at the molecular level. These techniques are beginning to improve, through positron emission tomography (PET scanning) and various other technologies that allow you to explore that. But these are expensive technologies, and, much like the collection of fresh, frozen biopsies, this is an area pharma finds difficult because patients are needed. These techniques cannot be used in-house in a pharmaceutical research setting; you have to actually be out there in a clinical environment.

A connection between the two environments will be required. Again, one way this may occur is through third parties setting up, for example, an academic community establishing a company specifically for functional imaging and providing a service to pharmas to help them develop the drug. That would be one way of doing it. The other way would be for big pharmas to set up an imaging facility within an academic environment. That is beginning to happen, and different mechanisms are being used.

The next five years will likely see smaller-scale development take place, much like the HercepTest for Herceptin; the focus will be on a direct, single-product relationship between a diagnostic and a drug. With Iressa, markers may be discovered that indicate the likelihood of response. But after five years, from 2008 to 2013, we will see more complex, additive measures being taken. It will no longer be just one factor that is measured; several factors will be measured, perhaps with the assistance of bioinformatics. Analysis of multiple parameters of patient material may be done to predict the best chemotherapy, or the best type of agent to give, in order to get the best result. From 2013 to 2020, I think we will see the beginning of proper personalized medicine. This will yield a situation wherein there will be 100 agents on the shelf for different molecular targets, and you will be able to predict the best set of agents to give, and maybe even the timing of when to give them.
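One highly simplified way to picture that kind of multi-parameter prediction is a weighted score per agent computed over several measured markers, as in the sketch below; the marker names, agent classes, and weights are entirely hypothetical and stand in for whatever a real bioinformatics pipeline would learn from patient material.

```python
# Hypothetical multi-parameter prediction: rank candidate agents for one patient
# by a simple weighted score over measured markers. Marker names, agents, and
# weights are invented for illustration; real systems would be far richer.
patient_markers = {"EGFR_expression": 0.9, "HER2_amplification": 0.1,
                   "proliferation_index": 0.7}

agent_weights = {
    "anti-EGFR agent":   {"EGFR_expression": 1.0, "proliferation_index": 0.2},
    "anti-HER2 agent":   {"HER2_amplification": 1.0},
    "antiproliferative": {"proliferation_index": 1.0},
}

def score(agent, markers):
    """Weighted sum of the patient's marker levels for one candidate agent."""
    weights = agent_weights[agent]
    return sum(weights.get(m, 0.0) * level for m, level in markers.items())

# Rank the shelf of agents for this particular patient
ranking = sorted(agent_weights, key=lambda a: score(a, patient_markers), reverse=True)
for agent in ranking:
    print(f"{agent:18s} score = {score(agent, patient_markers):.2f}")
```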

Finally, I don't believe that radiotherapy and existing cytotoxic chemotherapy will fall into disfavor; these therapies will be used to reduce the bulk of the tumor. The new, targeted therapies will be used for long-term control of disease once it has been reduced by the "old-fashioned" therapies. The problem with classic cytotoxic chemotherapy is that by 2009 everything is going to be generic and therefore, by definition, no longer of interest to big pharma. There are very few new cytotoxics going through development right now.