Microbial detection and surveillance form the backbone of all systems currently used to control infectious diseases, including foodborne diseases, worldwide. However, surveillance is still typically targeted at a relatively limited number of specified microorganisms and diseases, and, perhaps more importantly, there are significant global disparities in national disease detection systems and methodologies. In particular, public health efforts in the food area are hampered by several obstacles:
- The use of different, specialized, expensive, and difficult-to-compare detection techniques
- The use of separate (between sectors) characterization systems for relevant pathogens
- The lack of efficient, reliable typing systems for most pathogens, hampering efficient disease attribution to food sources
- Poor collaboration between microbiological fields (Listeria, Salmonella, viruses)
- (Inter)national policies restricting the disclosure of surveillance and research data
- Intellectual property rights
- Lack of sufficient diagnostic capacities, particularly in developing countries.
A more effective and rational approach is needed; efforts to mitigate the effects of foodborne and other microbial threats, focusing on improved surveillance and diagnostic capabilities, are crucial for bacterial as well as viral infections (Osterhaus and Smits, 2013).
While some national foodborne disease surveillance systems mainly collect information on the number of outbreaks and the number of cases involved in those outbreaks, for most foodborne diseases the majority of cases are sporadic. Surveillance systems must therefore record sporadic cases to enable a realistic estimation of the foodborne disease burden. New, active surveillance systems are likely in the future to blur the traditional distinction between outbreaks and sporadic cases. The possibility of comparing pathogen isolates genetically from human cases in a broad national system enables the linking of cases previously considered sporadic, revealing them as part of an outbreak spread over a larger geographical area (Gerner-Smidt et al., 2005). It is thus likely that our understanding of the relative importance of outbreaks and sporadic cases will change in the near future.
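The linking of geographically dispersed "sporadic" cases described above can be sketched, for illustration, as single-linkage clustering of isolates by pairwise SNP distance. All isolate names, sequences, and the 2-SNP linkage threshold below are invented for the example:

```python
from itertools import combinations

# Hypothetical SNP profiles for five isolates reported as sporadic cases
# in different regions (all names and sequences are illustrative).
isolates = {
    "case_A": "ACGTACGTAC",
    "case_B": "ACGTACGTAT",  # 1 SNP from case_A
    "case_C": "ACGTTCGTAT",  # 2 SNPs from case_A
    "case_D": "TTTTACGTAC",  # distant: likely unrelated
    "case_E": "ACGTACGTAC",  # identical to case_A
}

def snp_distance(a, b):
    """Count positions where two aligned sequences differ."""
    return sum(x != y for x, y in zip(a, b))

def cluster(isolates, threshold):
    """Single-linkage clustering: cases within `threshold` SNPs are linked."""
    clusters = {name: {name} for name in isolates}
    for a, b in combinations(isolates, 2):
        if snp_distance(isolates[a], isolates[b]) <= threshold:
            merged = clusters[a] | clusters[b]
            for member in merged:
                clusters[member] = merged
    return {frozenset(c) for c in clusters.values()}

for c in cluster(isolates, threshold=2):
    print(sorted(c))
```

With an illustrative threshold of 2 SNPs, four of the five "sporadic" cases are linked into a single putative outbreak cluster, while the genetically distant case remains separate. Real schemes (e.g., cgMLST- or SNP-address-based nomenclatures) use validated thresholds per pathogen, which this toy sketch does not attempt to model.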
As a result of increased global trade in food, it is also likely that outbreaks covering larger areas and affecting several countries will be recognized more often in the future. Surveillance of foodborne diseases provides information for action. The use of laboratory data in surveillance enables the identification of pathogens and the potential sources of infection. Integrated surveillance, including human data as well as animal- and food-monitoring data, will in the future provide the basis for preventive action along the entire food production chain.
With recent technological advances and declining costs in the next-generation sequencing field, these tools will play an increasingly important role in the surveillance and identification of new and previously unrecognized pathogens in both animals and humans. An enormous increase in the number of published whole genome sequences (WGS) is therefore to be expected, providing a wealth of information to aggregate, share, mine, and use to address global public health and clinical challenges.
Next-Generation Sequencing and Whole Genome Sequencing: A New Potential for Integrated Surveillance and Prevention of Foodborne Diseases
Surveillance is a key component of preparedness for the spread and development of infectious diseases, and is needed at the global level to monitor trends in endemic diseases (e.g., influenza, dengue, salmonellosis), to monitor eradication efforts (polio, measles, brucellosis), and to signal unusual disease activity (including new diseases or outbreaks of known diseases). Molecular diagnostic tools based on short pieces of unique genome sequence (e.g., PCR and microarray [biochip] technologies) have been used routinely for surveillance in different areas. However, genomic re-assortment events may easily be missed if surveillance relies on molecular diagnostic tools that target only small microbial genome fragments. Whole genome sequencing (WGS), a laboratory process that determines the complete genome sequence of the organism under study, therefore provides a far more comprehensive methodology. This can have important implications; for instance, during the recent outbreak of MERS (Middle East Respiratory Syndrome) coronavirus in the Middle East, analysis of small genome fragments did not provide sufficient phylogenetic signal for reliable typing and separation of virus variants, whereas WGS would enable such separation (Smits et al., 2015). Classically, whole microbial genome sequences were determined by PCR and Sanger sequencing. Nowadays, next-generation sequencing (NGS) techniques are used increasingly in human genomics, and are also widely used to identify and genotype microorganisms in almost any microbial setting (Wielinga and Schlundt, 2014).
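The limitation of fragment-based assays noted above can be illustrated with a toy comparison (all sequences invented): two genome variants share the short diagnostic target region and are therefore indistinguishable to a fragment-based test, while a whole-genome comparison separates them:

```python
# Illustrative sketch (all sequences invented): a PCR-style assay checks only
# a short conserved target, so two genome variants that differ elsewhere
# look identical to it; comparing full genomes separates them.

TARGET = "GGCACTA"  # short diagnostic target, e.g., a primer/probe region

variant_1 = "ATTTGGCACTACCGTAGGTACCAT"
variant_2 = "ATTTGGCACTACCGTACCCAGGAT"  # same target region, diverged elsewhere

def fragment_positive(genome, target=TARGET):
    """Mimics a fragment-based assay: detects only presence of the target."""
    return target in genome

def whole_genome_distance(a, b):
    """Number of positions differing between two aligned genomes."""
    return sum(x != y for x, y in zip(a, b))

print(fragment_positive(variant_1), fragment_positive(variant_2))  # True True
print(whole_genome_distance(variant_1, variant_2))  # > 0: WGS separates them
```

The same logic explains why a re-assorted or recombinant genome can pass a fragment-based screen unnoticed: as long as the assayed fragment is conserved, everything outside it is invisible to the test.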
NGS advancements and the development of NGS software tools have decreased the cost of WGS much faster than was predicted 10 years ago. It is likely that the price of WGS analyses will decrease to a point where it can seriously compete with traditional routine diagnostic identification techniques, while its speed will clearly out-compete traditional methodology. The potential of WGS in the investigation and surveillance of infectious and foodborne diseases has been demonstrated in many studies, including the tracking and tracing of the cholera outbreak in Haiti in 2010 (Hendriksen et al., 2011), the EHEC (enterohemorrhagic E. coli) outbreak starting in Germany in 2011 (Mellmann et al., 2011), and others (Allard et al., 2012). During the EHEC outbreak, scientists from around the globe performed NGS and shared their results for analysis. The collaboration between these researchers allowed for joint and rapid analysis of the genomic sequences, revealing important details about the new strain of E. coli involved, including why it demonstrated such high virulence. Similar collaborations have been created globally during emerging viral infections such as MERS coronavirus.
It is not the intention of this article to describe in any detail the different sequencing platforms used for NGS studies and, increasingly, in routine laboratories; Illumina, Ion Torrent, PacBio, Roche, and Nanopore are but a few of the platforms available. For a more thorough description of these platforms and their respective strengths and weaknesses, see Fournier et al. (2014). In general, a number of advantages can be attributed to these new platforms and to the use of NGS in food microbiology. Time and cost are typically mentioned as advantages: the time to achieve a full bacterial genome is now estimated at 24–48 h, so even if another 24 h is added for the bioinformatics part, a full typing result is available in 48–72 h, whereas traditional typing can easily take more than a week with existing microbiological methodologies. The discriminatory power of NGS in separating isolates down to the clonal level is also often mentioned as an advantage; if the sequencing is performed flawlessly, NGS performance in this area is clearly ahead of traditional typing methodology and will open new potential both in microbiological characterization and in epidemiological understanding. However, sequencers do not yet perform flawlessly, and herein lies probably one of the most important present limitations of NGS: errors in sequencing and in genome assembly can lead to potentially serious errors in identification outcome. Still, one of the most important advantages of NGS is that these DNA sequences can be collected and collated in large national, regional, or even global databases, so that at some stage we will have libraries of the full DNA of all existing microorganisms. This basically means that we will be able to create a global machine for the identification and characterization of all microorganisms.
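The idea of such a "global machine" matching isolates against a sequence library can be sketched, in a deliberately simplified form, as k-mer-based identification against a reference database. All reference names, sequences, and the k value below are invented; production tools (e.g., Kraken, MinHash-based sketching) use the same underlying principle at genome scale:

```python
# Minimal sketch of database-driven identification (hypothetical data):
# each reference genome is reduced to its set of k-mers, and a query
# genome is assigned to the reference sharing the largest k-mer fraction.

def kmers(seq, k=4):
    """All overlapping substrings of length k."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Toy "global database" of reference genomes (invented sequences).
reference_db = {
    "Salmonella_ref": "ACGTGCATTACGGATCCA",
    "Listeria_ref":   "TTGGCCAATGGCCTTAAG",
}

def identify(query, db, k=4):
    """Return the best-matching reference and the shared-k-mer fractions."""
    q = kmers(query, k)
    scores = {name: len(q & kmers(ref, k)) / len(q) for name, ref in db.items()}
    return max(scores, key=scores.get), scores

query = "ACGTGCATTACGGATCCT"  # near-identical to one reference
best, scores = identify(query, reference_db)
print(best)  # the closest reference, here Salmonella_ref
```

The sketch also hints at why sequencing errors matter: each base error perturbs k overlapping k-mers, so noisy reads or misassemblies directly erode the match scores on which identification rests.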