Blog Post December 20, 2019

Genomic medicine: Into the 2020s and beyond


It’s been a particularly profound decade for genomics and its emergence in medical practice, as the theoretical has become real. At the same time, progress in some areas revealed additional layers of biological complexity instead of medically useful information, and much of its predicted benefit remains unrealized in the clinic.

So what has changed since the beginning of the decade?

2010-2012 – Exomes and engineering

Seven years removed from the publication of the first human genome sequence, a lot of excited talk had yielded… well, very little clinically. The cost of sequencing was plunging, down to about $50k per genome from roughly $10M just a few years prior. But total costs were still too high for large-scale clinical genomics to be feasible, and a single genome took quite a while to sequence and analyze. Also, almost all sequencing done was short-read sequencing, in which the genome is chopped into small segments, sequenced, then computationally reassembled. For most of the genome it worked well, and it was cheaper, faster and more accurate than other sequencing methods. At the same time, some genomic regions, including highly repetitive sequences, centromeres and telomeres, remained inaccessible. The genomic data that began to emerge provided a substantial finding almost immediately, however: far more people needed to be sequenced to start making sense of the data.
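The chop-and-reassemble idea behind short-read sequencing can be sketched in a few lines of code. This is a toy illustration with a made-up sequence, not a real assembler: it shears a "genome" into short overlapping reads and greedily rejoins them by suffix/prefix overlap, an approach that works on unique sequence but breaks down on exactly the repetitive regions noted above.

```python
# Toy model of short-read sequencing (hypothetical sequence, error-free reads):
# shear the genome into short overlapping reads, then greedily reassemble.

def shear(genome: str, read_len: int = 8, step: int = 4) -> list[str]:
    """Chop the genome into short, overlapping, idealized reads."""
    return [genome[i:i + read_len]
            for i in range(0, len(genome) - read_len + 1, step)]

def overlap(a: str, b: str, min_olap: int = 3) -> int:
    """Length of the longest suffix of a that is also a prefix of b."""
    for k in range(min(len(a), len(b)), min_olap - 1, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def assemble(reads: list[str]) -> str:
    """Greedily merge reads by best overlap. Works on repeat-free sequence;
    repeats create ambiguous overlaps, which is the weakness noted above."""
    contig, remaining = reads[0], reads[1:]
    while remaining:
        best = max(remaining, key=lambda r: overlap(contig, r))
        k = overlap(contig, best)
        if k == 0:
            break
        contig += best[k:]
        remaining.remove(best)
    return contig

genome = "ATGCGTACGTTAGCCGATAGGCTA"
print(assemble(shear(genome)) == genome)  # True for this repeat-free toy genome
```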

In July 2010, Margaret Hamburg, then commissioner of the FDA, and Francis Collins, director of the NIH then and now, published “The Path to Personalized Medicine” in the New England Journal of Medicine. It makes interesting reading now, as many of the concerns and goals they identify haven’t changed all that much. They opine: “We recognize that myriad obstacles must be overcome to achieve these goals. These include scientific challenges, such as determining which genetic markers have the most clinical significance, limiting the off-target effects of gene-based therapies, and conducting clinical studies to identify genetic variants that are correlated with a drug response.” Over the decade there have been Herculean efforts to meet these scientific challenges, but many persist. They also decried the explosive growth of unregulated genetic testing, which has actually expanded in the decade since and has now entered the consumer marketplace, with unsubstantiated claims and associations all-too-common.

As was common at the time, Hamburg and Collins largely equated what was then called personalized medicine with “steering patients to the right drug at the right dose at the right time.” That goal is now associated with pharmacogenomics, just one component of precision or genomic medicine. The field writ large was indeed on the cusp of real clinical progress, but the rarity of whole human genome sequences and the low signal-to-noise ratio in the data required clinicians to narrow their focus. As a result, the initial impact was mostly seen in two areas: rare, often monogenic, diseases, and cancers with known mutational drivers.

Early clinical sequencing

Between 2010 and 2012, pioneering clinical programs at Baylor College of Medicine, Geisinger Health System, Intermountain Healthcare, and others, soon joined by Genomics England, launched clinical whole genome, exome (protein coding regions) and gene panel sequencing programs to help diagnose rare diseases and guide cancer therapy. It was still unknown what the vast majority of genetic variants actually did or how they combined with other factors in health and disease, and the utility and value of such efforts were hotly debated. Indeed, physicians and pundits were writing op-eds decrying the lack of progress and opposing proposals to implement genomic medicine. At the same time, early data supported the notion that exome sequencing was valuable for patients with undiagnosed rare diseases, consistently yielding a diagnosis rate of about 25% right out of the gate. And while many cancers did not yet have targeted therapies, cancer gene panels helped determine which therapies were likely to be ineffective, cutting down on the trial-and-error process common in oncology at the time. Importantly, all of these efforts also contributed to the growth of human genome data resources, which provided more insight year by year.

The advent of genetic engineering

In 2010, it was possible to “engineer” genomes by cutting them in a single place with zinc finger nuclease- or TALEN-based methods. They were highly accurate, but limited, slow, expensive and difficult to implement. Then, in 2012, researchers working with Clustered Regularly Interspaced Short Palindromic Repeat (CRISPR) sequences, an anti-viral system found in bacteria, engineered components of the system to cut DNA at precise locations in the genome. The implications soon became clear, especially after subsequent research showed that the system worked in all kinds of cells, including human cells. And not only could genes be deleted or disrupted, they could be repaired. What would happen if a deleterious mutation could be edited out and the correct sequence inserted? It was still strictly laboratory-based work, but not surprisingly use of CRISPR-based methods exploded as researchers attempted to learn more.
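The programmable targeting that made CRISPR so attractive can be sketched as code. This is a toy model with made-up sequences and a shortened guide (the real Cas9 nuclease uses a ~20-nt guide and cuts roughly 3 bp upstream of an “NGG” PAM motif); it simply scans for the guide sequence immediately followed by a PAM and reports where the double-strand break would fall.

```python
# Toy sketch of CRISPR-Cas9 target recognition (hypothetical sequences):
# a site is cut only if the guide matches AND is followed by an NGG PAM,
# with the blunt cut placed ~3 bp upstream (5') of the PAM.

def find_cut_sites(dna: str, guide: str) -> list[int]:
    """Return cut positions: guide match immediately followed by an NGG PAM."""
    sites = []
    for i in range(len(dna) - len(guide) - 2):
        pam = dna[i + len(guide): i + len(guide) + 3]
        if dna[i:i + len(guide)] == guide and pam[1:] == "GG":
            sites.append(i + len(guide) - 3)  # cut ~3 bp 5' of the PAM
    return sites

dna   = "TTACGATCACGTTGGCTTAACGATCACGTTAAC"
guide = "CGATCACGT"  # shortened guide for the toy example

# The guide occurs twice, but only the first copy sits next to an NGG PAM,
# so only one site is cut.
for cut in find_cut_sites(dna, guide):
    print(dna[:cut] + " | " + dna[cut:])
```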

2018-2019 – Delivery and dissemination

Fast forward to the last couple of years, as the end of the decade draws nigh. Do we have “the right drug at the right dose at the right time”? Sometimes. Targeted therapies guided by biomarkers are relatively common now, particularly in oncology, and many more are in clinical trial. Also, the arguments about pros and cons mostly concern cost and access—many of these therapies are sold at astronomical prices per dose—not effectiveness. Does the argument about the utility of genomics in medicine continue? Sometimes. But there is a profound change in what the basic argument is about. What used to be “we’ll see what happens if we’re able to implement exome testing (for example) in the clinic” is now “this is what happened when we implemented exome testing, and based on our data, here’s what we think is the best way to move forward with it.” Others may disagree about the best way forward, but the reality of genomics in the clinic is now pretty much a given.

Therefore, it’s fair to say that at least part of Hamburg and Collins’ vision has come to fruition. But significant obstacles remain, many of which are related to delivering what is still a complicated, expensive product, both in less affluent countries and within the U.S. healthcare system. Some vertical healthcare systems within the U.S., including the pioneers Geisinger and Intermountain, have moved quite aggressively, offering sequencing and analysis to large numbers of their patients and even healthy subjects, with promising results. And Genomics England is expanding its program as well within the U.K.’s National Health Service. But in many places, expense and access problems have limited growth.

Into the 2020s

So what are the next steps? In “2020 Vision: Predictions of what may shape precision medicine,” six experts opine on the topic. Interestingly, there are no common themes among them. Some identify methods that were mere pipe dreams in 2010, such as single-cell analyses and liquid biopsies for circulating cancer DNA, as driving progress for patients. Another predicts that genomic sequencing will go from mainstream to commonplace in countries such as the U.K. and will provide further insight into the medical utility of large amounts of human sequencing data. And while the use of CRISPR has presented challenging ethical issues, particularly around human germline experimentation, it’s almost ready to be applied in the clinic. Delivering the editing apparatus to sufficient numbers of mature cells is a huge challenge—current protocols involve using viral vectors to deliver it—but genetic editing is being tested in accessible tissues, such as in the retina to treat vision disorders.

As previously mentioned, along with the progress there has also been insight into unexpected biological complexity. For example, advances in long-read sequencing have helped reveal the importance of structural variants (SVs) such as duplications and inversions in human genomic variability and disease. Because SVs rearrange large blocks of DNA without necessarily changing the local sequence content, short-read methods often fail to detect them, yet they are an important consideration for any application of clinical genomics. Also, non-coding regions of the genome, which comprise ~98.5% of the sequence, are now understood to play vital regulatory roles in gene expression. Indeed, most of the genomic areas associated with common complex diseases (think cancer, Alzheimer’s disease, type 2 diabetes, etc.) are in non-coding regions, and most of them remain poorly understood. Therefore, some of the most important disease research areas remain challenging to address.
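Why short reads miss balanced structural variants can be made concrete with a toy simulation (hypothetical sequences, idealized error-free reads): invert a block of a miniature “genome,” then count how many short reads from the rearranged sample still match the reference perfectly on either strand, as an aligner would check.

```python
# Toy demonstration of why short reads overlook a balanced inversion:
# the rearranged sample contains almost no novel sequence, so most short
# reads still map cleanly to the reference.

def revcomp(s: str) -> str:
    """Reverse complement of a DNA string."""
    return s.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def reads_from(seq: str, read_len: int = 6) -> list[str]:
    """Every read_len-mer of seq, mimicking dense short-read coverage."""
    return [seq[i:i + read_len] for i in range(len(seq) - read_len + 1)]

reference = "ACGGTTCAGGCATTACGGAATCCG"
# Balanced inversion: the middle block is reverse-complemented in place,
# so sequence content is rearranged but barely changed.
block = reference[8:16]
sample = reference[:8] + revcomp(block) + reference[16:]

# Reference k-mers on both strands, as an aligner would consider them.
ref_kmers = set(reads_from(reference)) | set(reads_from(revcomp(reference)))
sample_reads = reads_from(sample)
matching = sum(r in ref_kmers for r in sample_reads)
print(f"{matching}/{len(sample_reads)} sample reads match the reference perfectly")
# Only reads spanning the two breakpoints fail to match; a long read
# crossing the whole inverted block would reveal the rearrangement.
```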

It’s a fascinating time, as the tools available to researchers may finally approach the power needed to explore the massive systemic complexity of biology that has previously stymied progress in many areas. Translating knowledge to medical progress will remain difficult—it always will be—but the sheer amount of knowledge will grow with increasing rapidity. In the end, it’s reasonable to expect that medicine in 2030 will be far different from what we have now, and we’ll look back on the 2010s as the decade that laid the foundation for true progress.