On November 12 and 13, 2015, the Food and Drug Administration hosted genome scientists from across the nation at its campus in White Oak, Maryland. Two public workshops engaged presenters and audience members on technical aspects of translating Next-Generation Sequencing (NGS) into clinical practice, including analytical performance evaluation, bioinformatics strategies, and the use of genetic databases to establish the clinical validity of a test. Although FDA has not yet developed a regulatory framework for NGS distinct from the process for more traditional diagnostic tests, the Agency’s stated goal is to institute “appropriate oversight, in a way that is more suitable to the complexity and data-richness of this new technology.” These activities are being driven by the President’s Precision Medicine Initiative, as well as by a health care market reality: patients and their physicians want to choose the right treatment, at the right time, ideally with minimal side effects.

FDA also is actively working to advance the science needed to develop necessary standards for NGS, as evidenced by the launch of its new crowd-sourced, cloud-based platform, precisionFDA — billed as “a community platform for NGS assay evaluation and regulatory science exploration.” The site (https://precision.fda.gov/) launched the night before the NGS workshops began, according to FDA’s Chief Health Informatics Officer Dr. Taha Kass-Hout, who announced the availability of the resource. Dr. Kass-Hout encouraged all scientists and companies working on NGS technologies to explore precisionFDA and provide feedback or contribute reference materials before the beta launch on December 15, calling it the genomic community’s version of a “sandbox to play in.” Following the beta launch, he noted, FDA will scale the environment and identify challenges that need to be addressed next year.

Dr. Kass-Hout also presented to workshop attendees an overview of the precisionFDA environment. Most importantly, to address potential intellectual property concerns associated with data-sharing on the platform, precisionFDA has been built with both a community work area and secure, independent work areas that companies can keep private and use for their own purposes. Uploaded data will default to private until a user decides to share those data outside of their own group. FDA informaticians will operate only in the community work area, and users can share data with the entire community or with a more limited group of collaborators. In addition to the private and “sharing” areas, features of precisionFDA include Apps like sample mapping and simulation tools; Comparisons (i.e., between a user’s test sample and a selected reference sample); and Notes. Dr. Kass-Hout was particularly enthusiastic about the Notes feature, which he explained will offer participants the ability to share information in a rich-text format.

In addition to the Agency’s news about precisionFDA going live this week, the two days of NGS workshops gave stakeholders from industry, government, medicine, and academia an opportunity to discuss and find common ground on certain issues that FDA has been examining as it considers appropriate regulatory approaches to NGS platforms and tests. Some of the key themes that emerged from the panel discussions and audience questions include:

  • Elements of both design concept standards and performance standards should be leveraged as the community develops processes for analytical validation of NGS tests. Design concept standards and performance standards represent two ends of a spectrum, and a hybrid or combination approach may be the best way to address the inherent limitations of each. Participants agreed that standards are necessary, even though designing them will be very difficult, and there was general agreement that those standards should be applied to reference materials used by the entire community. Such standards also should address the pre-analytical, analytical, and post-analytical phases of testing, including clinical interpretation.
  • Participants cited the need for transparency; a common language or nomenclature; and more reference materials (renewable or otherwise) available to everyone to help move the field forward. Transparency will be important in multiple areas, such as standard operating procedures, analytical methods, and validation studies. The need for public validation data also was discussed in the context of bioinformatics tools, and participants debated how quickly, and to what extent, people should be replaced by software and automated processes. There seemed to be consensus that human involvement to catch errors and other problems continues to be necessary and, indeed, is desirable at this stage in the industry’s development.
  • There was widespread support for a regulatory approach that would rely on a “data commons” to establish clinical validity of genetic variants during the development of a new NGS test. And although there was agreement on general requirements for gene variant databases — such as the need for a particular database to be well-curated and of high quality, both in terms of the data sets themselves and all of the operations of the database — the most effective way to meet those goals was the subject of extensive debate. Participants shared differing views and discussed features of the National Institutes of Health-curated database, ClinVar, to elaborate on desirable and undesirable characteristics of databases that could be applied to clinical interpretation of variants. Among other topics examined was whether and how test results are communicated to clinicians, as well as who bears responsibility for educating clinicians (and consumers) about genetic variations that are uncovered.

The November workshops build on discussions during a similar event held on February 20, 2015 to explore new regulatory approaches for NGS platforms and tests. Presentation slides, a full transcript, and an archived webcast of the February meeting are available on FDA’s website.