The Big Data Revolution

Among the catchy new tech terms of the past decade is one with many lessons for the future - "Big Data." Essentially, it refers to datasets so large and complex that processing them requires enormous computing power. More importantly, big data brings new ways to model past trends and to make ever more accurate predictions about the future.

The 2012 election should push big data even further into the spotlight - Obama's campaign team raised $1 billion not by repeating its 2008 approach, which was successful but still flawed, but by adopting a new "measure everything" approach. The team tested whether phone calls from a swing state were more effective than calls from a non-swing state, tracked whose emails performed best in which season, and allocated resources based on computer simulations of the election. To make it all work, the campaign hired an analytics team twice the size of its 2008 counterpart, led by a "chief scientist" with experience crunching huge datasets. It was this fine-grained approach to campaigning that helped the President land his second term in office - something the GOP has yet to learn.

Big data also proved useful in predicting the election results. Nate Silver, a statistician and political analyst long criticized by both sides of the aisle, correctly predicted every state's outcome this election season, after calling the 2008 election with remarkable accuracy. Silver's model eschews the industry's conventional wisdom of chasing day-to-day poll results and instead weighs the various factors - economic and otherwise - that have influenced election outcomes in the past, to make rational and surprisingly accurate projections.
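
The basic idea behind such forecasts - turning state-by-state win probabilities into an overall projection by simulating the election many times - can be sketched in a few lines. This is a toy illustration only, not Silver's actual model; every probability, vote count, and state in it is hypothetical.

```python
import random

# Toy Monte Carlo election forecast (illustrative only).
# Each contested state maps to (hypothetical probability that candidate A
# wins it, number of electoral votes). We simulate many elections and
# count how often candidate A reaches 270 electoral votes.
CONTESTED = {
    "Ohio": (0.55, 18),
    "Florida": (0.50, 29),
    "Virginia": (0.54, 13),
    "Colorado": (0.57, 9),
}
SAFE_VOTES_A = 237  # electoral votes assumed safe for A (hypothetical)
NEEDED = 270

def win_probability(n_trials=10000, seed=42):
    rng = random.Random(seed)  # fixed seed for reproducibility
    wins = 0
    for _ in range(n_trials):
        total = SAFE_VOTES_A
        for prob, votes in CONTESTED.values():
            if rng.random() < prob:  # candidate A carries this state
                total += votes
        if total >= NEEDED:
            wins += 1
    return wins / n_trials

print(f"Candidate A wins in {win_probability():.1%} of simulations")
```

A real model would of course estimate those per-state probabilities from polls, economic indicators, and historical data, and account for correlated errors across states; the simulation step itself, though, is this simple.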

But big data isn't limited to political analysis - far from it. In fact, its application in other fields may be even more intriguing and beneficial.

A natural candidate is the bio-pharmaceutical industry, where a steep drop-off in new patents, along with rising clinical trial costs, has left a void that big data can help to fill.

Recently, GNS Healthcare, a big data analytics company, announced that it is partnering with Mount Sinai School of Medicine to develop a big-data-based computer model of multiple myeloma - a rare blood cancer that can require a bone marrow transplant to treat. Researchers will use the model to study potential new drug targets and tailor prospective treatments to each patient.

This approach, which uses computerized models based on clinical data, offers a cost-effective, efficient way forward for developing new treatments, particularly for orphan diseases - those that occur in fewer than 200,000 individuals in the U.S., making typical randomized clinical trials difficult and extraordinarily expensive.

Ten years ago, processing power would have been a roadblock; now the barriers that remain are regulatory. The FDA has yet to fully embrace the big data revolution and still relies mostly on outmoded clinical trial guidelines (with the exception of some cancer and HIV drugs). Large-scale clinical trials were suitable for the infectious diseases of the past, but complex chronic diseases - cancers, Alzheimer's, and various neurological ailments - require a new approach. Allowing drug developers to apply big data methods to find molecular targets and match them to patients based on electronic medical records would open a new frontier in drug development (see my colleague Paul Howard's discussion of personalized, individually tailored medicine and its application to oncology).

What are the hurdles here? Standardizing electronic medical records to carry genomic or other biomarker data; getting more patients to agree to allow their data to be "mined," with appropriate privacy protections; and building the databases that will allow practicing oncologists to access the latest research in real time. Data, after all, is a two-way street, flowing up from patients and doctors, and back down again as researchers plumb cancer's complex molecular networks.

Utilizing "Big Data" would allow drug companies to cut development costs, pursue many more prospective drug candidates, and weed out ineffective compounds faster. And better-targeted medicine translates into better value for patients and payers.

Regulatory reform along these lines would also send a broader signal to pharmaceutical companies that the U.S. is committed to remaining at the cutting edge of pharmaceutical development, ensuring that the associated benefits - first access to novel drugs and sustained domestic R&D - accrue to Americans.
