Personalized Medicine

Rapid advances in the basic sciences, including new fields like genomics and proteomics, are enabling companies to develop medicines targeted at specific patient populations based on "companion diagnostics" - assays used to detect specific genetic mutations present in cancer, like the HER2/neu gene for breast cancer, an aggressive variant present in approximately 25 percent of breast cancers. Patients who test positive for HER2 can be assigned to therapies like Herceptin, a monoclonal antibody designed to block HER2. The combination of targeted therapies and diagnostics enables physicians to direct new therapies at the patients who are most likely to respond, while sparing others potentially serious side effects. Diagnostic-guided therapies can also help reduce unnecessary or wasteful health care costs. One study of the KRAS gene in metastatic colon cancer found that a genetic test could identify patients with a KRAS mutation who were unlikely to respond to powerful but expensive colon cancer drugs like Erbitux. Researchers estimated that if all patients with metastatic colon cancer were screened before treatment, the health care system could save up to $600 million.

For the uninitiated, 3D printing is a fast-growing manufacturing technology that effectively allows "printing" of small objects like machinery components. Where the "revolution" part comes into play is that the "printers" are small enough and inexpensive enough to let almost anyone set up a mini-factory in their garage - or laboratory. These mini-factories are connected to a computer where 3D models are designed and fed into the printer (along with the necessary raw materials). Using techniques such as laser sintering or layer-by-layer deposition, the printer builds the object to specification.

Last December I wrote about Organovo - a 3D bioprinting company that was partnering with Autodesk to print living, architecturally correct human tissue. A new, potentially more exciting development is that researchers from the University of Edinburgh have developed a printer that is able to produce living, viable embryonic stem cells. For those suffering from chronic diseases like Alzheimer's or Multiple Sclerosis, this has the potential to be life-changing.

While most adult tissue has its own stem cells, embryonic stem cells are unique - they are able to differentiate into almost any type of tissue to repair it after it has been damaged. In recent years, however, there's been quite a bit of controversy surrounding the ethics of using embryonic stem cells (which have to be harvested from human embryos), to say nothing of the possibility of rejection when injecting stem cells from one person into another.

While these issues remain, the ability to spit out these stem cells through a simple manufacturing process (the printer doesn't technically manufacture the cells - it clumps them into uniform droplets to keep them viable using two types of "bio-ink") provides a new, automated way of producing embryoid bodies (essentially clumps of stem cells) that can be used in treatments. And indeed, an ever-growing body of research indicates that stem cell treatments - even those derived from a person's own body (non-embryonic) - may help to cure (not only treat) chronic diseases like MS.

Of course, any optimism should be tempered with reality. Printing stem cells that have biomarkers indicating pluripotency (the ability of a stem cell to differentiate into any type of cell) is very different from using those same cells to treat diseases in humans. It's unclear whether the human body will be able to accept these manufactured stem cells or how viable they will be in the long-run compared to natural stem cells.

There are also potential regulatory pitfalls. The FDA has been less than accommodating to companies that have tried using autologous stem cell treatments (where stem cells are taken out of a person, treated, and injected back into the same person), shutting down the laboratory of a promising venture in 2012. Though the FDA has a specific statute under which it regulates human cells and tissues, these newly manufactured cells would likely not fall under that statute. Instead, a company would probably need to pursue approval as a drug - but the long, winding, and expensive process of clinical trials is poorly suited to proving the ability of stem cells to treat chronic diseases.

A more stem-cell-friendly approach would allow companies offering these treatments to conduct "N=1" trials - for patients who have decided that an unproven treatment may very well be worth it if it has the possibility of curing a disease like MS - and submit this data over time to the FDA to help prove the efficacy of the treatment as well as to potentially help validate new surrogate endpoints.  

A recent NPR article resurrects an important, controversial issue for the FDA - stem cell treatments.

Often considered the body's "master cells", stem cells help form and repair damaged tissue in the body. The most potent - embryonic stem cells - can essentially differentiate into any other kind of cell, potentially allowing them to regenerate tissue in the brain, the spine, or anywhere else. As the name implies, however, embryonic stem cells only exist in embryos, raising ethical concerns. Also, stem cells derived from embryos face the prospect of tissue rejection when placed in a new host, potentially requiring recipients to take dangerous immunosuppressive drugs for life.

Adult stem cells, by contrast, exist in tissue across the body, but they are more specialized and can only repair specific types of tissue. In 2011, Celltex Therapeutics was formed; the company offered a process that would alter adult cells to have certain properties of embryonic cells. These cells would then be re-injected into the patient from whom they came. The thought was that this could help patients with a variety of chronic diseases - Multiple Sclerosis, Alzheimer's, amyotrophic lateral sclerosis (Lou Gehrig's Disease) - if not curing the diseases, then at least offering a potentially effective treatment where often none exists.

In 2012 the FDA identified over 30 violations at the Celltex facility, most of which pointed in one direction - that Celltex was marketing an unapproved drug. Under traditional FDA regulations, medical procedures (referred to as the "practice of medicine") are largely unregulated beyond basic sanitary standards, along with any regulations the FDA imposes specifically on the associated laboratory tools or devices involved in the transplant. This stance covered the gamut from bone marrow transplants to in-vitro fertilization.

However, the FDA has taken an increasingly aggressive regulatory stance towards physician-offered stem cell therapies, based on the more than "minimal manipulation" standard (and indeed, the FDA's very broad statutory authority) that allows regulation of the process under 21 CFR 1271.3(f)(1) (the section of the FDA's regulations that deals with the use of human tissue). Under this statute, the FDA determined that Celltex fell short, and thus its process could be regulated as a drug.

While the FDA has shown promise lately by working to develop more adaptive standards for clinical trials, its position on stem cell treatments is rather worrisome. The fact is that stem cell treatments similar to the process Celltex was offering have already shown promise; proving that efficacy in double-blind randomized clinical trials, however, is extremely difficult.

Many of the diseases that stem cells would help treat are orphan diseases (those with fewer than 200,000 patients in the U.S.), making recruitment for large-scale trials very difficult. And for people suffering from diseases like Alzheimer's, the risk-benefit calculation often falls on the side of taking a chance on what may be an unproven treatment.

Instead of forcing companies like Celltex (which is now moving its operations to Mexico!) to comply with narrowly defined trial guidelines, the FDA should make it easier for such treatments to receive approval through an alternative pathway that recognizes the difference between Celltex's process and a cancer drug.

This would require a pathway that is more conducive to Bayesian analysis. For instance, multiple "N=1" trials (where one person decides to try an unproven treatment - like Ms. Wilkinson from NPR's piece) could be used to gather data on fairly simple clinical or surrogate endpoints (in the case of Ms. Wilkinson, 11 out of 25 of her MS symptoms improved!). This data could be used both to validate new surrogate endpoints for future use and to offer additional evidence to the FDA while approval is pending. Certainly, shutting down a lab without recognizing that physicians have been traditional innovators in the medical field - and still invent new surgeries without FDA regulation - seems like an extreme step. Surely there was a middle-of-the-road approach that would have recognized the value of the procedure while warning patients that it was highly experimental and collecting additional data that could be validated through other means.
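The Bayesian pooling idea can be sketched in a few lines of Python. This is only an illustrative toy, not a validated trial design: the 11-of-25 figure comes from the post, but the other patients' counts, the Beta-Binomial model, and the uniform prior are all my assumptions.

```python
# Toy sketch: pooling hypothetical "N=1" symptom-improvement counts
# with a simple Beta-Binomial update. Not a real trial methodology.

def posterior_beta(trials, prior_a=1.0, prior_b=1.0):
    """Update a Beta(a, b) prior with (improved, total) symptom counts."""
    a, b = prior_a, prior_b
    for improved, total in trials:
        a += improved          # improved symptoms count as "successes"
        b += total - improved  # the rest count as "failures"
    return a, b

# Ms. Wilkinson's reported result plus two invented patients.
trials = [(11, 25), (8, 20), (14, 25)]
a, b = posterior_beta(trials)

mean = a / (a + b)  # posterior mean improvement rate
print(f"Posterior Beta({a:.0f}, {b:.0f}), mean improvement rate {mean:.2f}")
```

Each new patient simply sharpens the posterior, which is the appeal of this design: evidence accumulates continuously rather than arriving all at once at the end of a large trial.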

But more importantly, when it comes to dealing with a part of your own body (remember, Celltex was taking stem cells out of person A and putting them back into person A) it seems rather draconian to classify the product in question as a drug.

While the long-term efficacy of stem cell treatments may not be certain, there is much evidence showing at least some short-term improvement in patients who receive them. For those with otherwise untreatable diseases, this is undoubtedly a gamble worth taking.

As the New York Times noted in its December 15 editorial, "When the Physician is Not Needed," health-care reform will propel millions of newly insured Americans into a system with far too few primary-care physicians.

That shortfall will have to be filled by other qualified providers, including nurse practitioners (NPs) operating in retail-based clinics. As I noted in a 2011 New York State Health Foundation-sponsored report, Easy Access, Quality Care: The Role for Retail Clinics in New York, retail clinics offer high-quality care for routine ailments at a fraction of the cost of physicians' offices or emergency rooms. Today, about 1,400 retail clinics operate nationwide.

Unfortunately, in New York, retail clinics are often hindered by outdated regulations. Fewer than 20 retail clinics operate in New York today, compared with 106 in Florida (which licenses corporate-owned retail clinics, although it does not license provider-owned clinics). Albany policymakers should rescind laws that prohibit companies from directly employing NPs and allow NPs to practice independently - making routine care more affordable and accessible.

This is not to say that all traditional hospital- or physician-based services are outmoded, just that every provider should be allowed to practice at the top of their license and utilize emerging technologies and new business strategies to reduce costs and increase efficiency. Providers that deliver enhanced quality, improved convenience, and lower costs to consumers - whatever license they happen to have - will win in the emerging cost- and data-driven health care marketplace.

Take another evolving consumer-oriented health care innovation, telemedicine. Telemedicine (via companies like Teladoc) offers a way for consumers to access physician services using phone or web-based platforms like Skype or FaceTime.

Telemedicine allows consumers to conveniently consult with expert physicians or specialists when they can't get an appointment with their regular physician, when they live in rural areas where specialists aren't available, or when they're traveling on business away from their regular physicians - avoiding an expensive and onerous trip to the emergency room. In New York, Beth Israel Medical Center offers telemedicine services for just $38 - including prescriptions, if necessary - compared to $75 or $100 for a traditional "bricks and mortar" office visit.

An even more promising evolution is the expansion of app-based health care services on your smartphone. My colleague Mark Mills has penned a terrific Forbes blog series on the app- and supercomputer-driven future of medicine, enabled by massive increases in computing power and ubiquitous Web access. The real-life Star Trek tricorder may not be far off, and it will eventually revolutionize medicine - with sophisticated analytics that can match or better the diagnosing skills of all but the best physicians.


Back to the New York Times editorial. Bravo to them for thinking about health care from the consumer's perspective, where cost, quality, and convenience are key. The irony is that the Affordable Care Act is building out mid-century American medicine - subsidizing high cost traditional insurance and physician access - when technology is poised to make the old paradigms obsolete.

Health care's labor problem won't be solved without massive increases in technology-driven productivity, especially given the double whammy of the Affordable Care Act and a rapidly aging population that is going to require much more care and care management. The resulting logjam of patients demanding that doctors "see them" will have to spur a revolution in how care is delivered, and by whom (or what).

In other words, paging Dr. Watson.

Over at Xconomy, Brian Patrick Quinn of Vertex Pharmaceuticals (the company that developed Kalydeco) discusses the need for a strong American manufacturing sector:

Manufacturing - as many others have argued - is vital to many strong businesses and to all strong societies, even in the 21st century.

Quinn makes a strong case for a resurgence in American manufacturing, making the important distinction that modern-day manufacturing is not the bleak, Dickensian dystopia that it was in days of yore. And while we hear every day about China's comparative advantage at manufacturing, we tend to forget about our own edge - technology.

Vertex, for instance, started construction on a new production plant - one that is more efficient and capital-intensive than the old-fashioned, labor-heavy factories abroad. It allows higher-quality drug production at significantly lower cost, and as Quinn rightly notes, having the research lab next door allows new discoveries to be implemented more quickly.

This advantage isn't unique to Vertex either; the entire American pharmaceutical industry is ever more tech-savvy and innovative. The personalized medicine revolution underscores the need for a tight-knit relationship between production and research, in order to adapt on the fly - something that factories abroad may have difficulty with. And this is essential in supporting innovation:

Perhaps the most valuable trait of the manufacturing sector is its capacity for supporting innovation. In fact, experience shows that innovation and manufacturing processes are too interdependent to work well when they're separated...

But supporting American manufacturing requires more than just sitting back - reducing regulatory barriers and providing appropriate push/pull incentives will be critical. Uncertainty about the FDA's future course makes these investments riskier and makes it less likely that investors will see a particular company as a "winner". After all, to have a manufacturing plant, you need to have something to manufacture. On this front, having strong and consistent leadership from Commissioner Hamburg is a plus, as is the FDA's willingness to reform clinical trial requirements for antibiotics, accelerate access to "breakthrough therapies", and rationalize regulation in other ways. While companies can innovate anywhere in the world, assuring rapid access to America's large pharmaceutical market is certainly an attraction in locating R&D and manufacturing facilities in the U.S., close to regulators.

States are doing their part as well. Massachusetts, for instance, has funded a 10-year, $1 billion life sciences initiative to support the existing biotech cluster in the region. Similar public-private partnerships at the local level can give companies incentive to establish (or maintain) manufacturing plants in the region.

At the federal level, reforming the decrepit American tax system will also make the U.S. a more competitive location for global headquarters (and consequently manufacturing). After all, pharmaceutical manufacturing requires appropriate funding from venture capitalists and other investors who can choose from an international menu of tax regimes.  Our tax system, at a minimum, shouldn't be chasing them away.  

Let's hope Washington gets the memo.

We're just at the dawn of molecular medicine. And it's going to change everything.

Take cancer treatment, for instance. Next generation therapies for cancer - including new molecular-targeted therapies, nanotechnology-enhanced chemotherapy, gene therapy, and cancer immunotherapy - have the potential to "disrupt" traditional cancer treatment paradigms, radically improving outcomes and (in the long run) sharply lowering the costs of treatment.

Researchers and the media have been talking about the revolution in "personalized medicine" for more than a decade, but the most promising therapies are only now beginning to reach the clinic, with even more powerful therapies in the pipeline behind them.

As these treatments reach the mainstream, they will make many of our current health care debates obsolete. Every few years, we wring our hands about the cost of new drugs, and ask how pharmaceutical companies can charge so much for treatments that only extend life by a few weeks or months.

Of course, incremental innovations are better than no innovation at all, and some new cancer therapies, like Gleevec, are truly "game changers". For a handful of other cancers (like breast cancer and testicular cancer), survival rates have skyrocketed as companies and researchers have substantially improved both diagnostics and treatments. But for most solid tumors, and some blood cancers, the prognosis is still unremittingly grim and the treatment costs are very high.

That prognosis, however, is likely to change, as both the effectiveness of new treatments rises and their cost plummets as new technologies mature. For instance, the New York Times this week chronicled how researchers at the Children's Hospital of Philadelphia genetically re-engineered leukemia patient Emma Whitehead's own T-cells - using a deactivated version of the virus that causes AIDS, no less - to attack her cancer, acute lymphoblastic leukemia. This was a last ditch experimental treatment, because Emma's cancer had resisted every other treatment her doctors had tried. The Times explains:

To perform the treatment, doctors remove millions of the patient's T-cells - a type of white blood cell - and insert new genes that enable the T-cells to kill cancer cells. The technique employs a disabled form of H.I.V. because it is very good at carrying genetic material into T-cells. The new genes program the T-cells to attack B-cells, a normal part of the immune system that turn malignant in leukemia.

The altered T-cells - called chimeric antigen receptor cells - are then dripped back into the patient's veins, and if all goes well they multiply and start destroying the cancer.

The T-cells home in on a protein called CD-19 that is found on the surface of most B-cells, whether they are healthy or malignant.

What is even more remarkable is that when Emma developed a life-threatening complication from the immunotherapy, her doctors were quickly able to run a battery of diagnostic tests to isolate the specific immune response that was causing the problem (an overproduction of interleukin-6). They then used another drug (off-label, normally used for rheumatoid arthritis) to save her life. The treatment has since been used successfully in several other patients who developed the same complication.

Researchers might have to administer another dose or two of the therapy later, or might not - they can easily track her cancerous B-cells to make sure the disease remains in check. Her genetically altered T-cells, however, will remain in the body, roaming hunter-killers seeking out signs of cancer. (Although, as the Times notes, the engineered T-cells attack all of Emma's B-cells, cancerous or not, since they both express the same cell surface protein. However, if researchers can identify a more specific cancer protein signature they can spare the healthy cells by making the engineered T-cells even more precise.)


The work at CHOP is a stunning advance for cancer immunotherapy and personalized medicine, since the T-cells must be tailored for each patient, rather than brewed in enormous vats, like traditional drugs. The drug company Novartis is backing the commercial development of the technology, so it can eventually be scaled up for far more cancer patients - and eventually applied to other cancers, including solid tumors. (The first successful application of cancer immunotherapy, although it appears to be less successful as a therapeutic, is Provenge for advanced prostate cancer.)

As we suggested earlier, another bright spot in this story is the cost of the modified cell therapy, about $20,000 per patient, according to the Times. Compare that to the cost of chemotherapy. One drug, Clolar, can cost $68,000 for two weeks of treatment for relapsed pediatric ALL. Bone marrow transplants, another ALL treatment option, can cost hundreds of thousands of dollars and require long hospital stays.

Another advantage of tailored immunotherapies (like other targeted therapies) is that they can show efficacy rapidly in smaller clinical trials, lowering the cost of development and allowing companies to press FDA regulators for rapid marketing approval in light of the high benefit-risk ratio for patients who've run out of other options. Doctors will then - as Emma's doctors did - fine-tune them on the fly as diagnostic and treatment options improve around them.

CHOP and Novartis are helping to pioneer a completely different model of drug development and approval, one that can help de-risk the entire industry and enable rapid follow-on innovations. While industry is going through the doldrums now, Emma's saga is a welcome sign that the future of the industry - and the science underlying it - is bright.

Of course, for Emma Whitehead and her parents, just having a future to look forward to is enough. The next time you hear someone worry about the cost of new cancer treatments, you might want to mention her story to them.

Paul Howard & Yevgeniy Feyman

Democrats sold - and continue to sell - the ACA as a way to cover the "millions of people" with pre-existing conditions who can't get affordable insurance. For instance, in defending their recently released guaranteed issue regulations, HHS claimed that 129 million Americans have pre-existing conditions.

This is a huge bait and switch. The vast majority of Americans with "pre-existing conditions" already have insurance. Why? Age is strongly correlated with developing a chronic illness - and seniors are covered by Medicare. If you're disabled, poor, and can't work, you're eligible for Medicare and Medicaid. The low-income poor (healthy or not) are already eligible for Medicaid. In between, the majority of Americans have employer-provided insurance and are already protected, through HIPAA, from pre-existing condition exclusions or rate hikes due to illness.
Who's left then? Not that many people.

In fact, preliminary results from the Centers for Disease Control and Prevention's (CDC) National Health Interview Survey indicate that even among the uninsured, only 1.7 percent considered themselves to be in poor health, compared to 6.8 percent of those on Medicaid and just 0.6 percent of those with private insurance.

A Medical Expenditure Panel Survey report from 2007-08 also estimated that only 16 percent of the uninsured had two or more chronic conditions - compared to one-third of those with private insurance and 50 percent of those with public coverage (Medicare and Medicaid).

In a 2010 National Affairs article, James Capretta and Tom Miller estimate that only 2-4 million uninsured Americans with pre-existing conditions need additional financial help accessing insurance, preferably through high risk pools.

High-risk pools allow people with serious pre-existing conditions to get affordable coverage without increasing insurance costs for the young and healthy uninsured. Yet this is where Obamacare has also failed, despite a modest effort. Under the law, federal high-risk pools were established to provide access to coverage for patients without insurance and with pre-existing conditions. A recent evaluation found that only about 45,000 people signed up for these pools - a fraction of the 375,000 that CMS expected. Reasons proposed for the failure of the pools include low funding (only about $5 billion) and high costs for signing up. Regardless, for the last four years Obamacare has failed to expand coverage to those with pre-existing conditions who really needed it.

Ironically, Obamacare also attacks consumer-driven health plans - which a recent Mercer report credits with helping to hold health insurance inflation to a 15-year low - threatening to drive up insurance costs just as we're identifying the tools to keep them in check. Various requirements such as the minimum medical loss ratio (that insurers must spend at least 80 percent of premiums on benefits) and minimum actuarial value (that plans must cover a minimum of 60 percent of expected healthcare costs) make consumer-driven health plans - which often pair low premiums with high deductibles - less viable.

Ultimately, the biggest flaw with the ACA's insurance market reform is that it enforces expensive insurance regulations on the entire small group and individual insurance markets, increasing the cost of getting insurance for the vast majority of uninsured who are basically in good health. It also extends insurance subsidies up to 400% of the poverty level - to people who could easily afford to purchase coverage on their own.

Obamacare's failure at what should have been its primary goals leaves the door open for conservatives to start pushing for reform. The House could pass legislation repealing Obamacare's community rating and guaranteed issue regulations (as our colleague Avik Roy has suggested), and fixing Obamacare's flawed high risk pools. Paring back the subsidies (from 400% to 200% or 300%) would also lower Obamacare's price tag while still helping people who need it the most.

Governors of states that refuse to establish Obamacare's health exchanges (or expand Medicaid coverage) could also push for legislation to allow Medicaid funds to be used to help purchase private insurance for the vast majority of non-disabled, non-elderly Medicaid enrollees. This would provide high-quality private coverage, and prevent people from shuffling between Medicaid and private insurance as their income changed. True state flexibility in Medicaid program design might also convince many governors to re-think their opposition to Obamacare's Medicaid expansion.

The debate on fixing or fighting Obamacare is likely to continue for years to come. In the meantime, moderates and conservatives should point out that Obamacare's biggest shortcomings are self-inflicted - they didn't have to happen in the first place but can (and should) be remedied.

Among the new catchy tech terms of the past decade is one with many lessons for the future - "Big Data." Essentially, it refers to datasets so large and complex that processing them requires a huge amount of computing power. More importantly, big data brings new ways to model past trends more accurately and to make ever more accurate predictions for the future.

The 2012 election should push big data even further into the spotlight. Obama's campaign team raised $1 billion not by repeating its 2008 approach - successful, but still flawed - but by taking a new "measure everything" approach. They tested whether phone calls from a swing state were more effective than calls from a non-swing state and which emails performed best in which season, and they allocated resources based on computer simulations of the election. To make it all work, the campaign hired an analytics team twice the size of 2008's, with a "chief scientist" experienced in crunching huge datasets. It was this fine-grained approach to campaigning that helped the President land his second term in office - something the GOP has yet to learn.
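At bottom, the email experiments described above amount to comparing two conversion rates and asking whether the difference is real or noise. Here is a minimal, hypothetical sketch using a standard two-proportion z-test; the email counts are invented, and this is not the campaign's actual methodology.

```python
# Illustrative A/B test: did email B out-perform email A?
# All counts below are made up for the sake of the example.
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for H0: the two rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)        # rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Email A: 540 donations from 20,000 sends; Email B: 610 from 20,000.
z, p = z_test_two_proportions(540, 20_000, 610, 20_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With enough volume, even small differences in response rates become statistically distinguishable, which is why "measure everything" pays off at campaign scale.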

Big data also proved useful in predicting the election results. Nate Silver, a statistician and political analyst long criticized by both sides of the aisle, correctly predicted every single state's result this election season, and the 2008 election with remarkable accuracy. Silver's model eschews the industry's conventional wisdom of following day-to-day poll results and instead uses various factors - economic and otherwise - that have influenced election outcomes in the past to make rational, and surprisingly accurate, projections.
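To make the contrast with headline-chasing concrete, here is a toy poll aggregator that weights each poll by sample size and discounts it by age. The poll numbers, the half-life, and the weighting scheme are all assumptions for illustration - this is not Silver's actual model, which incorporates far more (economic fundamentals, pollster quality, and so on).

```python
# Toy poll aggregator: bigger and fresher polls count for more.
# All poll figures below are invented for illustration.
from math import exp

def weighted_poll_average(polls, half_life_days=14):
    """polls: list of (share_for_candidate, sample_size, days_old)."""
    num = den = 0.0
    for share, n, age in polls:
        w = n * exp(-age / half_life_days)  # exponential recency discount
        num += w * share
        den += w
    return num / den

# Three hypothetical state polls: (candidate share, sample size, days old).
polls = [(0.52, 800, 2), (0.49, 600, 10), (0.51, 1200, 5)]
print(f"weighted share: {weighted_poll_average(polls):.3f}")
```

The point of the exercise: a single outlier poll barely moves the weighted average, whereas day-to-day coverage treats each new poll as the whole story.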

But big data isn't unique to political analysis - far from it.  In fact, its application in other fields may be even more intriguing and beneficial.

A natural candidate is the bio-pharmaceutical industry, where a huge wave of patent expirations, along with increased clinical trial costs, has left a void that big data can help to fill.

Recently, GNS Healthcare, a big data analytics company, announced that it is partnering with Mount Sinai School of Medicine to develop a big-data-based computer model of multiple myeloma - a rare blood cancer arising in the bone marrow that can require a bone marrow transplant to treat. Researchers will use the model to help study new potential targets and tailor prospective treatments to each patient.

This approach, which uses computerized models based on clinical data, offers a cost-effective, efficient way forward to developing new treatments for diseases, particularly orphan diseases - those that occur in fewer than 200,000 individuals in the U.S., making typical randomized clinical trials difficult and extraordinarily expensive.

Ten years ago, processing power would have been a roadblock; now only regulatory barriers remain. The FDA has yet to fully embrace the big data revolution, and still relies mostly on outmoded clinical trial guidelines (with the exception of some cancers and HIV drugs). Large-scale clinical trials were suitable for dealing with the infectious diseases of the past, but complex chronic diseases - cancers, Alzheimer's, and various neurological ailments - require a new approach. Allowing drug developers to apply big data methods to find molecular targets and match them to patients based on electronic medical records will open a new frontier in drug development (see my colleague Paul Howard's discussion of personalized, individually tailored medicine and its application to oncology).

What are the hurdles here? Standardizing electronic medical records to carry genomic or other biomarker data; getting more patients to agree to allow their data to be "mined", with appropriate privacy protections; and building the databases that will allow practicing oncologists to access the latest research in real time. Data, after all, is a two-way street, flowing up from patients and doctors, but flowing back down again as researchers plumb cancer's complex molecular networks.

Utilizing "Big Data" would allow drug companies to cut the costs of development and focus on many more prospective drug candidates, as well as weed out ineffective compounds faster. And better-targeted medicine translates into better value for patients and payers.

Regulatory reform along these lines would also send a broader signal to pharmaceutical companies that the U.S. is committed to remaining at the cutting edge  of pharmaceutical development, ensuring that the associated benefits - first access to novel drugs and sustained domestic R&D - accrue to Americans.

Thanks to Charlie Hooper for drawing attention to a terrific Wall Street Journal article this week on the FDA's compassionate use program, which allows a small fraction of patients (about 1200) battling deadly diseases to receive access to experimental drugs.

The compassionate use program is designed to allow patients with serious or life-threatening diseases to access medicines that are still in development, and haven't been approved for marketing by the FDA. In this case, Dr. Nisha Gupta was infected with Hepatitis C, a very nasty viral infection that, until relatively recently, had few good treatment options. Dr. Gupta became very ill, with the disease eventually causing liver cancer, leading to a liver transplant.

Dr. Gupta petitioned several companies and the FDA for access to experimental therapies, with one company, Bristol-Myers Squibb (BMS), eventually granting her and her physician access to daclatasvir. (My fellow blogger Josh Bloom has written quite a bit about the next generation of antiviral therapies for Hep C, which represents a true breakthrough in the field). Today, thanks to BMS, Dr. Gupta is doing much better.

But the WSJ article implies that there is some sharp break between "experimental" medicines and FDA approved drugs.

The reality is far more complex. The public has the false impression that after the FDA approves a product, it is 100% safe and 100% effective. This is simply impossible. The enormous complexity of human biology means that no drug is safe and effective for every patient under every conceivable use - let alone in combination with other drugs the patient may be taking.

And yet pressures from Congress and the public to meet this impossible standard have led the FDA to demand ever more safety and clinical trial data from drug manufacturers, making drug R&D ever more expensive and time-consuming. Some commentators have gone so far as to call this Moore's law in reverse, or Eroom's law, where productivity is falling across the industry even as costs for new technologies (like genomic sequencing) plummet.

[Figure: Plunging pharma productivity. Source: The Productivity Crisis in Pharmaceutical R&D]

The FDA, or at least its senior leadership, has understood this challenge at least since the AIDS epidemic, and has moved to create regulatory "safety valves" that allow some compounds to come to market without the full dossier of safety and efficacy data otherwise required from manufacturers.

Compassionate use is one of those "safety valve" programs. Accelerated approval is another. These programs recognize that patients like Dr. Gupta will die if they don't have access to "experimental" therapies for which there are good - not perfect, but good - reasons to believe that they might help them.

The trouble is that these programs are very small, and (in the case of accelerated approval) have been largely limited to cancer, HIV, and orphan drug indications. It's also no coincidence that these diseases benefit from very well organized, very vocal, and very influential patients' groups. Other diseases - usually slower killers that afflict millions of Americans - aren't so lucky. Think diabetes, obesity, and central nervous system (CNS) disorders like Alzheimer's.

One study, by the Tufts Center for the Study of Drug Development, found that drugs for CNS disorders spend 102 months in clinical review, 40 percent longer than non-CNS drugs.

What to do about this disparity? And how can we create a better societal approach to balancing the risks and benefits of new medicines that can accelerate innovation?

The evolution of cancer treatment gives us a lens through which to view the future. Many new targeted cancer treatments attack only a small fraction of the true genetic diversity of cancer. Cancer is not one disease, or even one hundred diseases. There may be dozens of different subtypes of even rare cancers like gastrointestinal stromal tumors (GIST), which afflict just a few thousand patients annually.

Take the new, and very effective, drug crizotinib, from Pfizer. Crizotinib is an anaplastic lymphoma kinase (ALK) inhibitor, and the ALK rearrangement is found in about 4% of lung cancers. Now, crizotinib is a great drug for this population, and 4% of a common cancer like lung cancer is still a great many tumors, but what about the other 96%?

We're going to need far more drugs - many, many more drugs - to challenge not just the genetic diversity of cancer but the inevitable development of tumor resistance to targeted therapies. Even relatively "simple" cancers like chronic myelogenous leukemia (CML), with a single driving genetic mutation (BCR-ABL), will eventually develop resistance to powerful targeted drugs.

Gleevec, first approved by the FDA in 2001, may work for CML for 5 or 10 years in many patients. But resistance will come. And eventually, patients will develop resistance to follow-on versions of Gleevec as well.

So, even with many of the best drugs we have for CML today, in a relatively uncomplicated cancer, what we're really doing is buying time. The trade-off is eminently worth it, because we're talking about years or decades for patients, but we haven't achieved the kind of disease control we have for, say, AIDS.

At the other end of the spectrum, new molecular screening technologies are uncovering prospective targets in cancer much faster than we're coming up with drugs for them.

So the challenge is to put more shots on target, and develop multiple drug cocktails that shut down multiple targets simultaneously and eventually conquer the problem of tumor resistance (again, AIDS is the paradigm here).

Drug companies are very worried about this. Why are they worried? Because the time and cost necessary to bring even targeted therapies to market is still staggering. Even in a banner year - 2011 - the FDA approved only 35 drugs (and not all of them for cancer, obviously). Crizotinib, a miracle drug for some lung cancer patients, took five years to go from lab to patients.

That's blazing speed for any drug, but still far too slow, given the challenge. And think about the need, for a moment.

Not just for Dr. Gupta and the few thousand patients like her in the compassionate use program, but for the over 500,000 patients who die every year from metastatic cancer; 80,000 from Alzheimer's; and nearly 70,000 from diabetes.

Tweaking the system around the edges - expanding compassionate use, for instance - is not what we need.

What do we need?

Another approach, gaining currency among regulators and drug companies, is the idea of adaptive licensing. Adaptive licensing would allow market access to targeted populations early in the drug testing process, i.e., after basic safety and efficacy testing is completed. Drugs would then be followed in the postmarket environment through electronic medical records.

Adaptive licensing/approval might be followed by ongoing randomized controlled trials to confirm efficacy or uncover rare adverse effects, or outcomes could be validated by using observational methods or targeted diagnostics. In some cases, just comparing the treatment group to the natural history of the disease (for ALS, for instance) might be sufficient for translating adaptive licensing into full approval.

The trade-off, or bargain, is that companies start selling their drugs in very small but targeted patient populations with very high medical need, but then expand the label and indications as data is developed. This would be an iterative digital learning process that breaks down the barriers that currently exist between clinical research and real world patients.

Another way to think of it would be as a "rolling" approach to drug approval where safety and efficacy data would be continually collected in different populations in the "real world" to expand use, restrict use, or withdraw the product.

Obvious hurdles that an adaptive licensing approach would have to overcome would include liability concerns, patent issues (companies will have to be convinced that the FDA won't trap their drug in a niche population while the patent life of the drug is eroding), and convincing providers and insurers that adaptive licensing wouldn't foist expensive and unproven medicines on a credulous public.

But I think these are all very tractable problems. The key to success would be shifting from the idea of informed consent, which is now focused on patients enrolled in highly selective clinical trials, to an idea of informed choice in a market environment.

Consumers would have to accept more uncertainty in some respects, but it would be in return for greater potential benefits in areas of high unmet medical need.

The FDA's job would also shift from gatekeeper to chief information officer, ensuring that patients and physicians were empowered with the data they need to make smarter choices.

The irony is that we often don't have the information we need to make truly informed choices today. Drugs are tested in highly artificial clinical trials, before they are released into large populations. In fact, the way we test and approve drugs today actually penalizes companies for taking a more stepwise approach to learning about their products, because the patent clock is ticking every minute they are in testing.

In short, the FDA's compassionate use program is a very small tool for what is a very big problem. We need a new paradigm for thinking about patients and sustaining breakthrough innovations - offering patients less compassion, and better-informed choices.

The doctor-patient relationship is unique; almost sacred. Predicated on the Hippocratic Oath of "do no harm," physicians accept on themselves the responsibility for their patients' well-being. On the other side, patients entrust doctors to advise them on all decisions affecting their health. When government - whether state or federal - decides to intrude on this relationship, issuing mandates or laws, there needs to be more than a 'good reason' - there must be greater harm that can come from not interfering. Otherwise, a law may "[impair] the provision of medical care and may ultimately harm the patient," as noted by U.S. District Judge Marcia G. Cooke.

In a letter to the New England Journal of Medicine (NEJM), leaders of five professional physicians' societies express concern over what they see as a growing trend of state legislation intervening in the doctor-patient relationship. Among the legislation singled out, a New York law that was

enacted in 2010 and became effective in early 2011 [that] requires physicians and other health care practitioners to offer terminally ill patients "information and counseling regarding palliative care and end-of-life options appropriate to the patient, including . . . prognosis, risks and benefits of the various options; and the patient's legal rights to comprehensive pain and symptom management."

The authors note that end of life and palliative care is not a one-size-fits-all area of healthcare; instead, physicians can best determine what discussions are appropriate, and when. The law even imposes criminal penalties for 'willful violation.'

Another law singled out is the controversial Virginia legislation that originally required a woman to receive a transvaginal ultrasound - relatively invasive - before an abortion. While the bill was changed to require a transabdominal ultrasound instead, the authors maintain that the bill is still an unwarranted intrusion into the doctor-patient relationship.

Given the authors' apparently strong views on government interference in the doctor-patient relationship, it seems odd not to address recent developments at the federal level that are arguably more draconian - the Independent Payment Advisory Board (IPAB) established under the Affordable Care Act (ACA); the FDA's REMS policies; and Comparative Effectiveness Research (CER) used to establish a drug's effectiveness (and, eventually, reimbursement).

The Independent Payment Advisory Board

IPAB's goal is to reduce Medicare spending through 'efficiency improvements' such as reductions in reimbursements to physicians and drug companies. While IPAB is prohibited from changing benefits, it can effectively make reimbursement for a particular drug or medical device so low as to discourage utilization. Effectively, this is rationing. And of course, it will be unelected bureaucrats making these calls - to the detriment of patients, who may have fewer cutting edge drugs and treatments available to them under Medicare. This is particularly troubling in the emerging era of personalized medicine, when drugs or drug combinations may only be effective in discrete groups of patients. One-size-fits-all reimbursement is the last thing that patients and innovators need. Is this an intrusion? Without a doubt.

FDA Risk Evaluation and Mitigation Strategies (REMS)

In order to assure itself that the benefits of a drug outweigh the risks, the FDA has the power to require risk evaluation and mitigation strategies (REMS) from drug manufacturers. Essentially, this means that when providers want to prescribe a certain drug that has a REMS requirement, the manufacturer has to impose certain preconditions - everything from a basic medication guide to including certain tests prior to prescribing it, or limiting the drug to specialist physicians.  Jumping through these hoops may lead to delays in patient access to some medications.  Effectively, this represents an intrusion on the practice of medicine, although the FDA doesn't like to frame it that way.

While ensuring that drugs are prescribed correctly is important, we can't overlook the danger that the FDA may prescribe more REMS to protect itself from the negative publicity and Congressional criticism that attends serious but rare drug side effects. It's also worrisome that REMS seem to be expanding over time, with regulators increasingly looking over doctors' shoulders. The assumption is that physicians won't reach the same benefit-risk judgments that the agency makes, but they may have good reasons for doing so.

Both patients' and pharmaceutical industry associations have expressed concern over the FDA's REMS program: pharmaceutical companies have noted that the FDA has no black-and-white criteria for determining when REMS will be required; meanwhile, the National Health Council, an association of health organizations, has called for a Government Accountability Office (GAO) review of the REMS process to determine if it is preventing new drugs from entering the market.

Increasingly, concerns about the abuse of prescription drugs (especially pain medicines and drugs for ADHD) have made it harder for some law-abiding physicians and patients to access drugs for legitimate uses. Striking the right balance here is crucial, and bears careful watching.

Comparative Effectiveness Research

With growing healthcare costs viewed as a substantial problem for state and federal budgets, comparative effectiveness research is increasingly being viewed as an attractive cost control strategy. As President Obama once quipped, why pay for an expensive red pill when a cheap blue pill is available that does the same thing? Indeed, the 2009 American Recovery and Reinvestment Act (ARRA) allotted around $1 billion in funding for CER to "support research assessing the comparative effectiveness of health care treatments and strategies."

At the most basic level, CER studies compare two drugs to see which, on average, is more effective. The idea is that if a more expensive drug is only as effective as a less expensive alternative, there is no reason to cover the more expensive variant. On the surface this logic may seem intuitive, but it misses a critical point - most people are not the average patient. A drug that appears to offer nothing extra for the 'average' patient may in fact be helping a segment of the patient population that isn't being helped by another drug. While government sponsored CER can provide important information that markets may not produce (some comparative trials are so large as to be impractical for any single company), and patients should certainly have better incentives to pursue the most cost effective strategies, it's no panacea - and runs counter to, again, the growing trend of more personalized treatment options.
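The "average patient" pitfall is easy to see with a toy numerical sketch (the patients, markers, and response scores below are invented for illustration, not drawn from any real CER study): two drugs can have identical average effects while helping entirely different subgroups.

```python
# Toy illustration with invented numbers: two drugs, four patients.
# Two patients carry hypothetical marker X, two carry marker Y.
def average_benefit(benefits):
    """Mean response score across a list of patients."""
    return sum(benefits) / len(benefits)

drug_a = {"x1": 10, "x2": 10, "y1": 0, "y2": 0}  # helps only marker-X patients
drug_b = {"x1": 0, "x2": 0, "y1": 10, "y2": 10}  # helps only marker-Y patients

avg_a = average_benefit(list(drug_a.values()))
avg_b = average_benefit(list(drug_b.values()))
print(avg_a == avg_b)  # True: on average, the drugs look interchangeable

# But matching each patient to the drug that works for their marker
# doubles the average benefit - the gain a head-to-head average hides.
best = {p: max(drug_a[p], drug_b[p]) for p in drug_a}
print(average_benefit(list(best.values())))  # 10.0 vs. 5.0 for either drug alone
```

A CER comparison on the averages would call these drugs equivalent and cover only the cheaper one, even though dropping either drug halves the benefit for one subgroup.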

Controversial social policy issues, like those mentioned in the NEJM article, often garner the most attention.  However, there are broader and potentially much more serious intrusions into the doctor-patient relationship waiting in the wings. 

After all, the more government pays for health care, the more it will want to tell doctors and patients what they're allowed to have - or what they have to go without. 


The first presidential debate between Mitt Romney and President Obama was easily the wonkiest such debate I can recall in my lifetime. That's great for the country. But even better was the fact that Mitt Romney was able to correct a number of the misleading statements that President Obama has been making about Romney's plans for health care and entitlement reform. Let's review the details. . .

Continue Reading
Avik Roy, The Apothecary, October 4, 2012

The pharmaceutical industry, as I have noted before, is an industry in transition, reflecting significant change in the regulatory environment, technology, market expectations, and competitive set. Any one of these dynamics forces even leading companies to rethink fundamental assumptions about their business models, their go-to-market strategies, and the products and services they bring to market. But for pharmaceutical manufacturers, the fact that all of these forces are happening simultaneously accelerates the pressure and reinforces the need to challenge business model assumptions. As a result, orphan drugs appear to be an interesting opportunity to consider for ensuring continued success.

Blockbuster drugs have created massive profits for Big Pharma for quite some time. But as patents have expired over the past few years (and continue to do so), this model appears unsustainable. With the enormous success of these blockbuster drugs, and significant regulatory hurdles for new drugs, most manufacturers have spent R&D money investing in follow-on or me-too products, rather than developing innovative drugs. The result is a shockingly low number of truly new molecules and an equally disappointing ability to identify which drugs will work for which patients under a given set of circumstances. Nowhere is this more evident than in the treatment of cancer. Without change to the business model, there is grave potential for the bubble bursting in an industry that U.S. companies have dominated for years.

Further complicating the issue for the industry is that regulators, payers, and patients now question the benefit of some of these follow-on products compared to their predecessors. Many of the new drugs have not demonstrated significant improvement to justify additional cost, and blockbuster drugs have historically generated significant side effects across various populations. Globally the bar is being raised for what constitutes economic and clinical evidence.

The current model relies almost exclusively on the randomized placebo-controlled trials (RCTs) required for bringing drugs to market. These clinical trials utilize inclusion and exclusion criteria that wind up not properly accounting for the people who actually take the drugs in the real world post approval. And this is how we wind up with uncertain efficacy and a whole variety of negative effects that were not anticipated. This is where orphan drugs have an opportunity to play a potential role in Big Pharma's resurgence.

Orphan drugs target a small subset of the population (fewer than 200,000 people). To the extent one can find narrow indications and unmet medical need, there is an opportunity to capitalize on the advantages conferred by the orphan drug law that enable pharmaceutical and biotechnology companies to zero in narrowly, thus, avoiding some (if not all) of the issues with efficacy and safety mentioned above. Economic and clinical value is easier to demonstrate with a small population, and the pendulum is swinging in favor of narrower focus (niche markets).

Another benefit to orphan drugs is that these markets do have the potential to become "rolling blockbusters" (I provide a great example of one such success in an earlier post entitled "Today's Orphan Drug Could Be Tomorrow's Blockbuster"). Companies can first focus on patients with a certain combination of comorbidities for which an indication might be appropriate, and then roll out additional research, without investing the enormous amount of time and money in trying to make a "one-size-fits-all" product, which, at the end of the day, doesn't usually 'fit'. Rather than limiting themselves to one specific therapeutic area, companies could look at a variety of areas where they can make a difference, and focus a certain percentage of their efforts on demonstrating greater economic and clinical value for specific target populations.

Not only will this create public good, but it will also help get back to the roots of the pharmaceutical industry -- innovation and novel products that have kept the U.S. at the forefront of the global pharmaceutical industry. While this specialized focus might not eclipse blockbuster success, it will certainly improve the chance of the industry maintaining its status, as the blockbuster model has seen better days.

Orphan drugs come with some additional advantages over the traditional blockbuster approach. For one thing, the cost of development is significantly lower, particularly given that clinical trials can be smaller (i.e., Phase III trials can be conducted with fewer than 1,000 enrollees). This is a more hospitable environment than the standard drug development path. There are also other attractive benefits like fast-track approval, tax credits, and waived PDUFA fees for orphan drugs. And while evidence clearly must still exist - the FDA won't cut corners - when truly life-threatening conditions are at stake, it may mean that there will be more "wiggle room" in getting a product to market, especially when there are no alternative treatments.

There are, of course, some potential challenges that pharmaceutical companies will need to overcome. Most notably, increased competition draws increased scrutiny from the FDA and Congress, who question whether drugs are really orphan drugs with so much money being invested. There is potential for legislative restraints in the future, though these are likely to just be a part of doing business and an inevitable market force that will play itself out.

Additionally, payers have pushed back on orphan drugs. They still look at FDA approval, safety and efficacy, and the population for which the drugs are intended. No payer will say it won't cover the approved drug (particularly if it's the only drug out there), as this would result in tremendous public backlash. However, because of the price tag, there is great (and reasonable) concern over whether the drugs work. More scrutiny on efficacy is likely to come with orphan drugs, but, ironically, the flipside can be seen with "ultra-orphan" drugs (which target an extremely small population). For these drugs, the total spend may not be the same as a typical branded pharmaceutical, so they are unlikely to get the same attention.

Ultimately, payers are still concerned about cost. Through focus on real world evidence, there will be a lot of opportunity to prove viability of particular drugs. As a result, pharmaceutical companies that are able to monitor and track patients to make sure they see positive effects will see great success with endeavors in orphan drugs. The opportunity to engage in post-marketing studies will prove to be beneficial for both patients and for R&D. But some companies are better prepared for this than others. It will be crucial to establish these capabilities to make this work.

There is a significant opportunity for pharmaceutical companies to revitalize their bottom lines and benefit society at the same time. But attention must be placed on changing the way they do business. Patient engagement must change to make them a more integral part of the process. Real world evidence is at the heart of the success of orphan drugs, particularly as companies have the potential to utilize a "rolling blockbuster" model through more focused research. Today's orphan drug could be tomorrow's blockbuster!

Allow me to riff on a point that Jim Pinkerton made in a recent blog post on Medicare, that the parties are competing to accuse each other of being the ones to "betray" Medicare through cuts.


Jim has made the point elsewhere, as have I (here), that you can't cut your way out of the large fiscal cliff that we're facing as an aging population demands more health care, becomes more prone to devastating diseases like Alzheimer's, and becomes eligible for expensive nursing home care.

Is there any way out? The President's strategy is to adopt across-the-board Medicare cuts and then leverage them through expert advisory panels like the Independent Payment Advisory Board, which will strong-arm Congress into making yet more politically unpalatable cuts (we'll see how well that works in the long run).

Indeed, Medicare's own actuary expects that these cuts are unsustainable. The conservative strategy, which I support, is to move to more market-based arrangements, where seniors (and all Americans) choose among competing private insurance plans, with some protections for ensuring that everyone can afford at least basic coverage.

But this, by itself, won't save us from the tsunami of rising health care costs we've mentioned before. So what would? Jim suggests a "cure strategy" to conquer expensive and debilitating diseases - along the lines of a vaccine for polio or smallpox - which would be both popular and effective in terms of lowering costs. Jim writes that:

If either party, Republican or Democratic, were leading with a "Cure Strategy," as opposed to a "Cut Strategy," it's hard to see how they would be suffering at the polls as a result.

That is, who in America would have voted against the Democrats in 2010 if Dems had announced a crash effort to eliminate, say, Alzheimer's? Similarly, who would be voting against Republicans today if the GOP had made the same cure-Alzheimer's argument in 2012?

If the answer, in both cases, is "no one," then you have to wonder why neither party chose to advance that cure-first argument.

Thankfully, the private sector is already ahead of the politicos. Innovative companies are already harnessing "big data" to drive large improvements in how we diagnose, treat, and (eventually) even prevent disease. Take GNS Healthcare, profiled in this month's Burrill Report:

By applying artificial intelligence and increasingly sophisticated software algorithms, modern health data analytics companies like GNS are using integrated data sources, such as electronic health records and genomics data, to move beyond historical, retrospective reporting toward real-time, predictive analysis. It's an approach that is driving healthcare into a future in which data analytics will be utilized at every point of care...

The company's main product, its Reverse Engineering and Forward Simulation platform, uses a supercomputer-backed framework to automate the extraction of causal network models directly from observational data. It then uses high-throughput simulations to generate new insights about disease starting, in effect, with no hypothesis, just data.

The GNS strategy - along with that of many other companies exploring the same space - will push health care both towards more automation (which means lower labor costs as artificial intelligence helps streamline treatment and diagnosis of complex diseases) and towards more personalized treatments for patients (and more personalized treatments = fewer wasted treatments and better health outcomes).

So why not build a cures strategy to help this vision become a reality faster? Advancing this argument requires both political parties to go against their instincts.

Democrats would have to admit that private companies are the only entities that are nimble enough to actually implement a "cures strategy", and they're going to make an awful lot of money doing it (even though it will save lots of money in the long run and spur economic growth).

Republicans are used to touting market cures, but are somewhat less adept at articulating the case for the places where government can get things right: investments in basic research (NIH), getting the patent and intellectual property regimes correct, and serving as a "neutral ground" where all of the critical parties could come together to hash out the key issues involved (How do we handle patient privacy concerns? What to do about the inevitable lawsuits? Etc.)

Finally, the companies working in this sector with the biggest potential to blow up the status quo - the biotech and pharmaceutical companies, innovative start-ups like GNS, and non-traditional health care operators (like retail clinics) - are relative political lightweights.

Legacy health care players like nursing homes, hospitals, and (to a lesser extent) doctors carry more votes and are often more organized (think: SEIU 1199). This translates into much greater ability to set the terms of the debate when Washington sits down to write legislation.

In short, the political inertia is enormous to argue about what the legacy players care about, i.e., what they're paid for operating in today's system. This leaves both parties fighting over reimbursement strategies and formulas (private insurance v. public, increasing taxes v. entitlement reform), rather than thinking about what's around the corner and how to get there.

To be fair, plenty of very smart people in government and the private sector are thinking and talking about the evolution of personalized medicine. It just hasn't bubbled up into our politics yet as a topline issue.

But we can help make it a topline issue. Ultimately, I think Jim's cure strategy would work very well with the kinds of consumer-driven and patient-driven strategies that conservatives traditionally embrace. It should appeal to Main Street as well as Wall Street, because the gains, in human and economic terms, are mind-boggling. Save grandmother from Alzheimer's, and save yourself and your kids along the way.

So we need to do a lot more talking about not just what's wrong with the current system, but how to build a future that will make our current obsessions obsolete. (The same way that curing polio left iron lung wards in hospitals obsolete.)

Last but not least, we should remember that our competitors - Singapore, China, and the EU, are scrambling to try and develop the same cures we are. If we don't invent the "cures strategy" here, someone else undoubtedly will. And while that will be just as good for global health, I think that the U.S. is positioned to do it much faster and more efficiently, and (all other things being equal) I'd prefer the U.S. to spearhead the strategy and reap the economic benefits.

Advancing medical innovation is a cure both for the diseases that afflict us and the economic woes that beset us. As a matter of political optics and strategy, it is a low-hanging fruit that remains stubbornly unplucked.

Jim, back to you.

We've been talking about this on MPT for a while, but a few weeks ago Eric Topol, director of the Scripps Translational Science Institute and author of the new book The Creative Destruction of Medicine, explained why we should and can eliminate long, expensive, and cumbersome randomized clinical trials in the age of genomics and targeted diagnostics:

We have this big thing about evidence-based medicine and, of course, the sanctimonious randomized, placebo-controlled clinical trial. Well, that's great if one can do that, but often we're talking about needing thousands, if not tens of thousands, of patients for these types of clinical trials. And things are changing so fast with respect to medicine and, for example, genomically guided interventions that it's going to become increasingly difficult to justify these very large clinical trials.

For example, there was a drug trial for melanoma and the mutation of BRAF, which is the gene that is found in about 60% of people with malignant melanoma. When that trial was done, there was a placebo control, and there was a big ethical charge asking whether it is justifiable to have a body count. This was a matched drug for the biology underpinning metastatic melanoma, which is essentially a fatal condition within 1 year, and researchers were giving some individuals a placebo.

Would we even do that kind of trial in the future when we now have such elegant matching of the biological defect and the specific drug intervention? A remarkable example of a trial of the future was announced in May.[1] For this trial, the National Institutes of Health is working with [Banner Alzheimer's Institute] in Arizona, the University of Antioquia in Colombia, and Genentech to have a specific mutation studied in a large extended family living in the country of Colombia in South America. There is a family of 8000 individuals who have the so-called Paisa mutation, a presenilin gene mutation, which results in every member of this family developing dementia in their 40s.

Researchers will be testing a drug that binds amyloid, a monoclonal antibody, in just [300][1] family members. They're not following these patients out to the point of where they get dementia. Instead, they are using surrogate markers to see whether or not the process of developing Alzheimer's can be blocked using this drug. This is an exciting way in which we can study treatments that can potentially prevent Alzheimer's in a very well-demarcated, very restricted population with a genetic defect, and then branch out to a much broader population of people who are at risk for Alzheimer's. These are the types of trials of the future and, in fact, it would be great if we could get rid of the randomization and the placebo-controlled era going forward.

One of the things that I've been trying to push is that we need a different position at the FDA. Now, we can find great efficacy, but the problem is that establishing safety often also requires thousands, or tens of thousands, of patients. That is not going to happen in the contrived clinical trial world. We need to get to the real world and into this digital world where we would have electronic surveillance of every single patient who is admitted and enrolled in a trial. Why can't we do that? Why can't we have conditional approval for a new drug or device or even a diagnostic test, and then monitor that very carefully? Then we can grant, if the data are supportive, final approval.


Of course, I think this is a splendid idea. It would slash drug development times and allow patients much faster access to therapies matched to the underlying biochemistry of their disease.

Cancer is the area where we're seeing this "molecular hammer, meet molecular nail" approach develop fastest. Take Seattle Genetics' drug Adcetris, a CD30-targeted therapy approved by the FDA in 2011 for Hodgkin's lymphoma and anaplastic large cell lymphoma (ALCL).

If you're a cancer researcher, the first thing you want to know is how many other cancers overexpress CD30. It turns out, according to Xconomy, that researchers at MD Anderson and Stanford started looking at Adcetris as a treatment for a disease that wasn't even on Seattle Genetics' radar screen, cutaneous T-cell lymphoma (CTCL). If researchers get a hit, and they did, Seattle Genetics can then turn around and run a larger trial to confirm the benefit.

But here's the rub. Assuming you've got a molecular hammer like Adcetris, and the molecular taxonomy of your disease - maybe it's cancer, maybe it's something else - implicates CD30, you know that you've got a high likelihood of some efficacy. Do you want to wait for a Phase III trial? No. Are you going to want to go into a placebo-controlled trial? Or even a standard-of-care trial where you might get an untargeted treatment with serious side effects? No. And is your physician likely to prescribe the drug to you off label anyway? You betcha.

In these circumstances, the randomized clinical trial just doesn't make much sense. It's going to be overtaken rapidly by patients who know their own genomes, and the diagnostic tools that allow them to run N=1 trials that will allow them to rapidly screen drugs that might help them battle these diseases.

The key here is that you want a map of all the diseases that implicate this gene (or really, constellations of genes), and then you want to test the drug in these populations and find out what happens. Capturing that information and rapidly distributing it will be the coin of the genetic realm, leading to success for patients, regulators, and companies.

Safety, as Topol suggests, is something that we'll follow in the postmarket, because we'll have a confirmatory efficacy signal very early on with targeted therapies. Right now, the requirement for large trials that parse increasingly rare safety signals is the Berlin Wall facing drug developers, particularly for chronic diseases like obesity.

For more thoughts on the development of precision medicine, which is what we're really talking about, see this and this.

When my sister Paula was treated for Acute Myeloid Leukemia a few years ago, her doctors did not have any of the latest miracle cures available (such as Gleevec, a drug that has literally revolutionized the treatment of Chronic Myelogenous Leukemia since its 2001 approval). Instead, they had to rely on the broad-spectrum chemotherapy drugs daunorubicin and cytarabine -- old stalwarts of our war on cancer that date to the 1960s.

Chemotherapy agents work by poisoning all the quickly dividing cells in the patient's body, whether they're cancerous or healthy. When they're effective, they kill all or nearly all of the cancerous cells before they kill the patient. But they inevitably come with a raft of very serious and often (temporarily) disabling side effects. Like many patients undergoing chemotherapy, Paula felt for months on end as though the treatment were worse than the disease itself.

That's why the move by medical science into more targeted cancer therapies -- ones that either deliver a therapeutic dose to a specific or localized site in the body or consist of molecules specially designed to bind only with certain cell types -- has been hailed so broadly. Benefits include fewer or smaller doses given to the patient, greater confidence that the drug will find and destroy cancerous cells, and, perhaps most importantly to the patients, fewer and less severe side effects. And as the fields of genomics, proteomics, and metabolomics advance at a lightning pace, we are quickly learning much more about what makes cancers unique and how to target them effectively.

Unfortunately, the very high hopes we have for targeted drug therapies in these early days in their development are all too frequently accompanied by disappointment (see here and here) as one targeted therapy after another has proven to be ineffective or far less potent than we once imagined they would be. Experience is mixed, to be sure, and a handful of targeted therapies, such as Gleevec, have proven to be real breakthroughs. As the New York Times detailed two years ago in a three-part series of articles, the now-approved drug Zelboraf (then being tested in clinical trials under the moniker PLX4032) "produced seemingly miraculous results in some patients with [metastatic] melanoma" and a very specific genetic mutation.

However, in spite of the mixed clinical trial results related to efficacy, a new study published in the August edition of the journal Annals of Oncology has found that targeted drugs appear to be living up to the hype with regard to safety. On average, patients in Phase I clinical trials of targeted cancer therapies experience a markedly lower rate of the most severe (grade 3 and 4) adverse events associated with drug toxicity, and fewer and less severe physical side effects, than do patients undergoing traditional chemotherapy.

The study analysed data from 687 patients in 36 Phase I trials on a variety of different cancer types. And the findings offer some genuine hope to patients. "The theory behind targeted drugs is that they should affect only cancer cells that have a specific fault and spare healthy cells, which we hoped would lead to higher rates of efficacy and lower rates of side-effects," the study's lead author, Rhoda Molife of the Royal Marsden NHS Foundation Trust near London, told World Pharma News. "It's very pleasing that our study seems to back this up, at least in the context of Phase I trials."

Of course, the news isn't all good. World Pharma News also reports that, "for targeted drugs, the most common toxicities were gastrointestinal -- such as loss of appetite, diarrhoea and vomiting -- and fatigue, while side-effects for cytotoxic drugs are generally haematological or cardiovascular in nature." So, although targeted cancer therapies have fewer side effects, being treated with them is still no walk in the park. But the study's findings are one more bright spot in our slow but steady march to conquer the Emperor of All Maladies.

That's the conclusion of an article in the Financial Times earlier this week that highlights the growing importance of companion diagnostics for the pharmaceutical industry.

Traditionally, medicines have been given to large numbers of patients with an apparently common disease, all the while accepting that they will be a failure for many and cause significant side effects. Genetic testing identifies the smaller numbers of sufferers in whom the drugs work, reducing costly and ineffective treatment in others.

The article notes that twenty years ago, cancer drugs might only be effective in ten percent of the patients treated. Today, new diagnostics linked to drugs like Erbitux, used to treat colon cancer, can identify the 60 percent of patients without a mutation in the KRAS gene, which makes them more likely to respond to treatment. The diagnostic spares patients who don't benefit the risk of serious side effects, and ensures that drug spending goes to the patients who are most likely to benefit. It also allows drugmakers to charge higher prices to offset the tremendous costs of developing sophisticated new medicines.


The development of personalized medicines still faces significant hurdles. Cancers typically develop resistance to even targeted drugs, effectively evolving in real time to bring new cancer-promoting growth mechanisms online. Companies and researchers need to do more to understand the complex signaling networks driving cancer growth and mutation, and develop cocktails that can check or slow the development of drug-resistant cancers.

Over time, diagnostics will likely shift away from single gene mutations and towards more complex proteomic (and other "-omics") tests to measure these network interactions and target cocktail therapies appropriately from the initiation of cancer treatment. Some of these cancer roadmaps are already in development for leukemia.

And cancer is far from the only example: it has also become increasingly apparent that complex chronic diseases are rarely the result of a single gene mutation, but are driven by networks of complex gene and protein interactions. Scaling up our understanding of these networks and translating them into the clinic will be an enormous undertaking - and will be unworkable if taking a single new drug to market still takes over a decade and costs over a billion dollars.

We're starting to see the first glimmers of research networks taking shape that can take advantage of new genomic technologies and sophisticated IT architecture - through, for instance, the NIH's cancer Biomedical Informatics Grid, defined by "information liquidity" and breaking down the "invisible wall" that has traditionally separated the research and treatment communities.

There's a long way to go before the full promise of personalized medicine is realized.

But there's no going back.

New York Times reporter Gina Kolata has written three moving articles on the slow but inexorable (thanks to increasingly powerful "omics" technologies and plummeting prices) journey of whole genome sequencing and proteomics from academic laboratories into the frontlines of cancer treatment.

In her first article, Kolata writes about a young cancer researcher, Dr. Lukas Wartman, struck by one of the very diseases that he had hoped to spend his career researching, adult acute lymphoblastic leukemia.

Fortunately, Wartman happened to work at a cutting-edge genomics institute at Washington University in St. Louis, and his colleagues pooled their talents and tools to try to unravel the driving force of his cancer and find a way to halt it before it killed him.

Wartman also turns out to be "lucky" in one other key respect - his cancer is driven by the "upregulation" of a normal gene (FLT3), which was "wildly active" in his leukemia cells. He gets even luckier when his colleagues realize that there is already an approved drug that inhibits FLT3, Sutent. This particular story has a happy ending, with Wartman's cancer going into remission for a second time.

In her second article, Kolata chronicles a sadder outcome - the story of 69-year-old Beth McDaniel, afflicted with a rare type of lymphoma. In McDaniel's case, whole genome sequencing of her cancer leads to only a brief reprieve, as her cancer rapidly develops resistance to another promising drug, Yervoy.

In her third and final article in the series, Kolata examines how emerging genetic tests can help predict cancer outcomes (in this case, for a type of ocular melanoma) but not necessarily help patients identify treatments that will change their prognosis.

Writing about emerging treatments at the cutting edge of medicine is a challenging affair. Those who benefit from new treatments and protocols may have better access to new technologies and treatments, whether by dint of employment, friendship, wealth, or connections. It seems like an arbitrary process, and it is.

But this is certainly no more or less arbitrary than the genetic "lottery" of cancer itself, which strikes the wealthy and affluent - like Steve Jobs and Christopher Hitchens - as well as the poor and indigent. And although Jobs and Hitchens were "early adopters" of genomic technologies, they may not have benefitted from them and certainly weren't cured by them. And future patients will undoubtedly benefit from the efforts of the wealthy afflicted to find cures for their own diseases (through, for instance, initiatives like the Michael J. Fox Foundation).

Cancer may be the most complex human disease, and the effort required to battle it - let alone defeat it - is staggering. And Kolata's series gives a real sense of that: even when we have the technology to identify the molecular drivers of cancer growth, we may not have any drugs that block those targets, or the cancer may rapidly grow resistant to targeted therapies.

But the underlying trends favor cancer patients and their doctors, not cancer. The cost of sequencing technologies is dropping, and dropping rapidly. The analysis of the reams of data generated by sequencing is the current bottleneck, but increasingly powerful computers and information technologies will eventually crack that problem as well. And as more patients have their tumors sequenced, researchers will develop much better "roadmaps" for use in developing cocktail therapies to control metastatic cancers.

The remaining challenges, like the enormous cost of drug development and the speed with which we can validate and test drugs against new targets, are real and serious problems. It may take years, or even decades, to overcome them. But overcome them we will.

In this case, a picture is really worth a thousand words.


The FDA's user fee legislation sailed through the Senate last week with enormous bipartisan support (92-4) - although you probably missed it since the vote was drowned out by the Supreme Court's decision on Obamacare two days later. BioCentury's Steve Usdin has a terrific article describing the key provisions of the legislation (subscription required).

The most important aspect of PDUFA V may be that Congress has signaled that the FDA should no longer be hamstrung by the chimera of "perfect safety" as it weighs the risks and benefits of new medicines. Usdin writes that

By enacting the FDA Safety and Innovation Act, Congress has explicitly embraced the notion that the appropriate goal of drug regulation is to ensure a positive balance of benefits and risks. It also has implicitly accepted the idea that the risks posed by the lack of therapeutic options must be part of the benefit/risk calculation. [emphasis added]

This is a vital message for the FDA to hear, since Congress has often sent the exact opposite message after post-market safety concerns emerged with drugs like Vioxx and Avandia.

In addition to ploughing $4 billion in user fees back into the agency over five years (2013-2017), the legislation also contains a number of provisions designed to speed up drug development, improve agency transparency and communications with stakeholders, and advance regulatory science.

Among other things, it encourages the FDA to expand the use of Fast Track and accelerated approval beyond HIV and cancer; eliminates caps on conflict-of-interest waivers for FDA advisory committees; creates a new breakthrough therapies designation that is supposed to allow for expedited development and review of drugs for "serious and life-threatening illnesses" that show significant improvement over existing treatments in early-stage testing; expands PDUFA review deadlines by 60 days to allow for additional meetings between sponsors and FDA; and requires the FDA to establish a new risk/benefit framework for describing how reviewers actually evaluate the benefits and risks of new medicines.

How successful will these initiatives be? No one can really say for sure. Skeptics can argue (plausibly) that the FDA already has all of the statutory power it needs to do these things. But that may be beside the point. The "permission" from Congress may give the FDA's leadership the leverage it needs to push changes down to review staff. It will also fall to Congress to follow through with additional oversight to ensure that the FDA meets Congress' goals for advancing innovation and patient access to more effective therapies.

To date, the agency is saying all the right things. "From the FDA's perspective," FDA Commissioner Margaret Hamburg remarked at the recent BIO meeting in Boston, "the critical challenge now becomes implementation...delivering both on expectations and delivering on the real-world opportunities that are presented to us through this agreement and reauthorization legislation."

There are other, broader reasons to be bullish on the future of medical innovation and more flexible regulation. The speed at which the science is advancing in fields like genomics is truly astonishing, and will only continue to accelerate as costs drop (faster than Moore's law).

First, like it or not, the U.S. will find itself in increased regulatory competition with countries in Europe and Asia to commercialize R&D investments and bring them to market as quickly as possible - or risk losing the jobs and tax revenues that come from those R&D investments.

Second, there is the real possibility that the FDA could find itself lagging behind the science of personalized medicine as hospitals and health systems, health IT companies, and drug developers mine new data made available by linking electronic health records with genomic and other data that can translate latent knowledge into improved treatment protocols, promising new drug targets/indications, and more personalized (and thus cost effective) therapies.

Take, for instance, this partnership between Oracle and Aurora Health Care in Wisconsin.

Three or so years ago, Alfred Tector, one of the state's pioneering heart surgeons, contacted Oracle Corp., the database software company, about drawing on Aurora Health Care's electronic health records to find potential candidates for clinical trials of new drugs and medical devices.

Last week, Oracle announced the result of that initial call: the Oracle Health Sciences Network.

The new service will enable drug companies and health systems to cull information, with patients' names and other identifying characteristics removed, from electronic health records and other databases to determine whether a health system has enough patients to participate in a clinical trial.

That initially could enable clinical trials to be done quicker. But, more importantly, if the service is successful, it could become part of the infrastructure needed to develop drugs targeted for patients with a specific genetic makeup and for other medical research, such as comparing the effectiveness of alternate treatments.
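The cohort-screening step the article describes - checking de-identified records against a trial's inclusion criteria to see whether a health system has enough eligible patients - can be sketched in a few lines. This is purely illustrative: the field names, diagnosis codes, and enrollment threshold below are invented for the example and are not Oracle's or Aurora's actual schema or API.

```python
# Hypothetical sketch of clinical-trial feasibility screening against
# de-identified EHR records. All field names and criteria are invented
# for illustration; no real system's schema is implied.

MIN_ENROLLMENT = 300  # assumed minimum cohort size for a viable trial site


def is_eligible(record, criteria):
    """Check one de-identified record against simple inclusion criteria."""
    return (
        criteria["min_age"] <= record["age"] <= criteria["max_age"]
        and record["diagnosis_code"] in criteria["diagnosis_codes"]
        and record["biomarker"] == criteria["required_biomarker"]
    )


def feasible(records, criteria, minimum=MIN_ENROLLMENT):
    """Return (eligible_count, has_enough) for a health system's records."""
    count = sum(1 for r in records if is_eligible(r, criteria))
    return count, count >= minimum
```

The point of the sketch is that the screening query never needs names or other identifying fields - age, diagnosis, and biomarker status are enough to answer the feasibility question, which is what makes the de-identified model workable.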

Linking the network (as Aurora is already doing) with biospecimens and tissue samples will help make the next logical leap - allowing companies to quickly develop and test hypotheses for emerging biomarkers, recruit patients for clinical trials, and rapidly run "proof of concept" trials showing that the treatments work as advertised and have a significant treatment effect. (That is, in itself, a roadmap for developing "breakthrough therapies.")

In this emerging model, pressures will rapidly build on the FDA - from industry, academic medical centers, patients' groups, and others - for the agency to lead, follow, or just get out of the way. That is a paradigm shift the FDA isn't yet equipped to grapple with, but PDUFA V gives it Congress' imprimatur - and some new tools - for adapting to the new environment.

The challenge of FDA modernization shouldn't be underestimated. But science moves at its own pace. It'll be left for the regulators to catch up.

Though we often think of "personalized medicine" as cutting-edge, technologically advanced medicine tailored to an individual's particular biochemistry, an article in today's Wall Street Journal about the blood-thinning drug Plavix offers an instructive example of how a more expansive conception of personalized medicine is essential to improving patient outcomes.

As the article notes, Plavix tends to be a very effective treatment for the blood clotting that often leads to heart attacks, but it doesn't work for everybody. Roughly 30% of people have a gene variation that limits their body's responsiveness to the drug, hindering its effectiveness. As a result, some people end up taking Plavix without enjoying its salutary benefits. As genetic testing becomes more common and less expensive, it will become easier to identify who these people are and to move them to more effective treatments (or adopt different dosing strategies to overcome the genetic resistance).

But the article also offers a simpler explanation for why patients on Plavix don't always benefit from the drug: they simply forget to take it. Research into patient compliance suggests that "50% of heart patients stop taking important medications within a year of their initial prescription."

Both of these factors--the presence of drug-inhibiting genes and the tendency of patients to stray from their prescribed drug regimens for chronic diseases--can profoundly affect patient outcomes. And though high-tech genetic research has the potential to expand our knowledge of drug effectiveness tremendously, we should not forget that any tool - from a telephone call from a nurse practitioner to an email, or an app on your iPhone - used to improve a particular individual's health constitutes a kind of "personalized medicine."

Simple solutions like reminding patients to take their medicine could ultimately prove just as helpful in saving lives as advanced technology that delves into the complex relationship between genes and drug effectiveness.

Fostering the development of "personalized medicine" therefore does not just mean investing in R&D. It also means creating financial incentives for insurance companies and health care providers to look after their patients' health in more personal ways that are designed to maintain health rather than just to treat illness. The technologically adept among us can also make use of iPhone apps such as "RxmindMe," a free service that alerts individuals when they need to take their prescriptions.

Low-tech forms of personalized medicine are just as crucial as their high-tech counterparts. A full embrace of personalized medicine demands that we appreciate the potential that lies in both.

Yesterday's Wall Street Journal featured an article citing new studies that challenge the longstanding perception that hormone-replacement therapy (HRT) is an exceedingly risky treatment for women suffering from menopausal symptoms. This perception became widespread and entrenched roughly ten years ago following a government study conducted by the Women's Health Initiative that abruptly ended when data showed that women using HRT had higher rates of heart disease, stroke, and breast cancer than other menopausal women taking a placebo. However, new studies featured in the journal of the International Menopause Society, Climacteric, indicate that some women using HRT can in fact benefit greatly from it; these women generally enjoy relief from menopause symptoms as well as other significant health benefits, most notably a reduced rate of heart disease.

What separates these studies, and what led them to reach such widely divergent conclusions, is that they tested the effectiveness of HRT for women in different age groups. The study published by the Women's Health Initiative primarily examined the effects of HRT on women who had long been in menopause--according to The Wall Street Journal, the average woman who participated in the Women's Health Initiative study had already been in menopause for 12 years. However, the Climacteric studies seem to corroborate the "window-of-opportunity" theory, which predicts that women who begin using HRT before age 60, or within ten years of menopause, are more likely to experience health benefits than those who use HRT later in life.

The lesson to be taken from these studies is that medical research is done best when personalized and tailored to individual patients. The sweeping conclusions reached by the Women's Health Initiative a decade ago concerning the purported dangers of HRT for menopausal women only actually hold for a particular subset of menopausal women, and the health benefits of HRT for women who don't fall in that subset can be tremendous.

This case thus illustrates a serious danger latent in the "comparative effectiveness research" approach taken by the Patient Protection and Affordable Care Act: mass studies of how different treatments affect large swaths of people are not likely to pick up the subtleties and nuances in treatment effectiveness that can vary from person to person based on a whole range of factors that determine each individual's unique biochemistry. These diverging studies further illustrate that the most significant advances in medical research, and by extension in medicine itself, are likely to be patient-oriented - but these advances may not be realized if we continue to support overbroad, one-size-fits-all "comparative effectiveness research" as our method for evaluating medical treatments.

I think it's arguable that the FDA doesn't need explicit authority from Congress to develop more flexible regulations for the most promising therapies early in drug development. The FDA already has at least some regulatory flexibility under the 1997 FDA Modernization Act.

But let's get to that thought in a moment.

MSNBC reports that the FDA is explicitly endorsing a provision in both the House and Senate versions of pending legislation that reauthorizes FDA user fee programs for drugs, devices, generic drugs, and biosimilars. (And kudos to Senators Bennet, Burr, and Hatch, who introduced the legislation back in late March.)

Experimental drugs that show a big effect early in development for treating serious or life-threatening diseases would get a faster and cheaper path to U.S. approval, under a proposal likely to become law this year.

U.S. drug regulators would be able to label such treatments "breakthrough" therapies, and work with companies to speed up clinical trials, for example by testing the drugs for a shorter time or enrolling fewer patients.

The U.S. Food and Drug Administration has said it supports the proposal, which is included in both versions of an FDA "must-pass" funding bill currently working its way through Congress and set to be passed by the end of the summer. ...

Dr. Janet Woodcock, head of the FDA's drugs center, has said the FDA needs more flexibility to bypass "business as usual" when it sees unexpected effects, or when a new medicine can greatly help patients.

"What happens when you have a breakthrough drug that shows an effect that's never been seen before?" she told reporters in March, discussing the proposal.

"If we'd done business as usual during the AIDS epidemic, we would have never controlled that epidemic," Woodcock said.

This is all to the good, as I noted in another blog post.

On the other hand, it also reinforces an underappreciated reality: the FDA looks over its shoulder at Congress when it reviews and approves new medicines, and develops new drug approval mechanisms.

This is only natural. The FDA is in the news only when there is a drug safety problem, at which point it will get savaged by Congressional committees for not predicting and preventing every potential safety problem in advance.

Sadly, the FDA isn't going to be called up to the Hill to be congratulated on new drug approvals, or for streamlining the drug development process in general. So the pressure on the FDA tends to go only in one direction - towards requiring more data, more clinical trials, and more tests. This, in turn, drives up the costs of drug development and leads to delays in patient access to new medicines.

(To be fair, FDA aside, public and private payers in the U.S. and Europe are also demanding more data from companies to justify premium pricing in markets that are increasingly crowded with cheap, relatively safe, and effective generic drugs. The public is also increasingly wary about side effects from medicines for chronic illnesses that patients may take for years or decades. So the agency is only one factor in the development equation, albeit one of the most decisive ones.)

In short, the environment on the Hill often isn't very hospitable to regulatory innovation.

Still, the FDA is empowered to set standards for "adequate and well-controlled trials," and Congress has explicitly given the agency the authority (under the 1997 FDA Modernization Act) to approve drugs based on single-arm trials - yet it still almost always requires two placebo-controlled trials for drug approval.

In other words, the FDA has plenty of discretion. What it needs from Congress - and from the public - is permission to exercise that discretion and to do so with confidence that policymakers will not excoriate them when something goes awry. (And something will always go awry eventually, because neither medical science nor human beings are perfect.)

On that front, there's a lot to like in the current FDA user fee reauthorization. On both the House and Senate side, Congress has broadened the accelerated approval pathway and directed the FDA to embrace new technologies like biomarkers.

To be sure, FDA leadership hasn't embraced every proposal for updating its drug development toolkit - it expressly opposed the first version of Senator Hagan's TREAT bill - but its recent support for the breakthrough therapies designation is a very welcome sign that agency leadership knows that it needs Congress' permission to stop doing "business as usual" and help drive a more flexible mindset among its own reviewer staff.

AIDS was a very visible crisis. The challenges we face today are more subtle and longer term - unsustainable health care costs, an aging population that will become increasingly vulnerable to chronic diseases like Alzheimer's and cancer, and a drug development pipeline that is floundering. But they also demand a rethinking of the entire drug development and approval process - starting with the FDA and its stakeholders.

Since the FDA is a regulatory body, Congress must take the lead role in defining the policies that frame and provide effective oversight for the FDA's regulatory functions. The House and Senate should be applauded for crafting user-fee legislation that can facilitate and accelerate access to more innovative treatments for millions of American patients.

Now, it is up to the FDA to show that it can embrace and create real change.
