
The landscape of next-generation sequencing (NGS) continues to be defined by astonishing technological progress. Sequencer throughput keeps expanding with systems like the Illumina NovaSeq X+ and Ultima Genomics UG100 in the short-read space, and the PacBio Revio and Oxford Nanopore PromethION® in the long-read space. Surprisingly, even with these profound advances, the overall throughput and scalability of genomics remained a central challenge for the scientific community in 2025.
This seeming paradox stems from a critical shift: the bottleneck has migrated from sequence throughput to library preparation throughput. While the sequencers themselves act as powerful engines, the increasing complexity and limitations inherent in upstream sample handling and NGS library chemistry continue to hold back the capacity to generate the scale of data required by modern genetics and biotechnology. Understanding and overcoming these operational and economic hurdles is essential for unlocking the next wave of high-throughput applications and delivering the promised benefits of genomic technology across the biotechnology landscape.
Library prep’s scalability challenge
The core challenge in realizing massive NGS throughput now lies in the fundamental design of existing library preparation (library prep) workflows. Historically, many of these tools were developed for low-plex applications in short-read sequencing, such as sequencing one or only a few human genomes, or running RNA-seq experiments, in a single run. These methods are inadequate for today’s throughput demands, and will likely remain so for assays involving many hundreds or even thousands of samples per run.
There is an ongoing effort to determine which steps can be eliminated from existing library prep processes and to drive process improvements by reducing operational complexity. Across many applications, simpler library prep workflows translate to fewer steps in the lab, fewer chances for error, and lower labor and reagent costs, while greatly accelerating the time to sequence data. One dimension of library prep with large potential for improvement is simplifying the steps of multiplexing and normalization.
The ability to multiplex is a key driver of high-throughput genomics and necessitates sample normalization: a predictable way of ensuring that individual samples, often hundreds or thousands per run, that differ in starting concentration can be converted into sequencing libraries and sequenced at defined levels with consistent data quantity and quality. The lack of normalization severely restricts sequencing throughput by reducing the number of samples that can be reliably multiplexed on the same sequencing run, increasing the risk that data from some samples will be lost or degraded. New normalization chemistries within library prep, for example tagmentation (transposase-based fragmentation and indexing), ligation-based normalization, and fluorometrically controlled PCR, are showing promise in eliminating normalization as a separate step in library prep workflows.
Related innovations include adding indexes or sequence barcodes early in workflows to permit earlier pooling and reduce downstream handling burdens. Tagmentation extends this concept by removing the need for a separate step to fragment DNA molecules before adding sequencing adapters.
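To make the normalization burden concrete, below is a minimal sketch of the per-sample arithmetic a conventional workflow performs before pooling: converting each library's measured concentration and average fragment size into a molar concentration, then computing the volume each sample must contribute to an equimolar pool. The sample names and values are hypothetical, and this illustrates the general calculation rather than any specific kit's protocol; chemistries that self-normalize or allow early pooling aim to remove exactly this step.

```python
# Illustrative normalization arithmetic for equimolar pooling of NGS libraries.
# Values are hypothetical; this sketches the general calculation, not a vendor protocol.

DSDNA_G_PER_MOL_PER_BP = 660  # approximate molar mass of one dsDNA base pair


def molarity_nm(conc_ng_per_ul, mean_frag_bp):
    """Convert a library's mass concentration (ng/µL) to molarity (nM)."""
    return (conc_ng_per_ul * 1e6) / (DSDNA_G_PER_MOL_PER_BP * mean_frag_bp)


def equimolar_volumes_ul(libraries, fmol_per_sample=10.0):
    """Volume (µL) of each library that contributes `fmol_per_sample` fmol to the pool.

    `libraries` maps sample name -> (concentration in ng/µL, mean fragment size in bp).
    Note that 1 nM is equivalent to 1 fmol/µL.
    """
    return {
        name: fmol_per_sample / molarity_nm(conc, frag)
        for name, (conc, frag) in libraries.items()
    }


if __name__ == "__main__":
    # Hypothetical libraries with ~10-fold differences in yield: pooled at equal
    # volumes without normalization, sample B would dominate the read counts.
    libs = {"A": (1.5, 450), "B": (15.0, 450), "C": (6.0, 380)}
    for name, vol in equimolar_volumes_ul(libs).items():
        print(f"{name}: pool {vol:.2f} µL")
```

Doing this reliably for hundreds or thousands of samples requires per-sample quantification and precise liquid handling, which is why chemistries that normalize library yield inherently, or that allow pooling before such measurements, are so attractive.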
Continued reliance on automation
Genomics continues to integrate automation aggressively to overcome the inherent complexity and time sink of manual or semi-automated library prep, and increased adoption is driving throughput across all aspects of the workflow. This trend isn’t merely about speed; it’s about addressing fundamental resource consumption and cost pressures.
- Labor reduction: Automated liquid handling platforms offer a critical means of eliminating, or at least lowering, the costs associated with labor, which can represent a significant portion of laboratory budgets. Minimizing reliance on manual handling also reduces the opportunity for operator error.
- Chemistry miniaturization: Automation also enables the miniaturization of chemistry to reduce costs even further.
- Advanced dispensing technology: Tools like acoustic dispensing are attractive because they can miniaturize dispensing volumes while simultaneously using fewer tips.
Economic and sustainability hurdles
As throughput scales dramatically, the operational and financial challenges independent of the sequencer become starkly apparent. This economic reality applies to both academic research and the biopharmaceutical sector, where budget reductions have been seen throughout 2025. Reducing not just the cost of library kit reagents but also the labor and consumable demands of library prep workflows can determine whether a project is financially workable.
Increased throughputs have generally led to massive increases in the use of plastics, particularly tips. This explosion in plastic consumption presents a dual challenge: it is a significant environmental concern and a substantial cost burden.
Being mindful of consumable expenses therefore becomes essential. Automation methods that enable miniaturized reactions to decrease reagent costs, as well as those that minimize tip usage, are highly attractive when organizations are trying to squeeze every penny out of these assays.
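As a rough illustration of that cost pressure, the sketch below compares per-batch reagent and tip costs for a full-volume workflow versus a miniaturized, low-tip-count one. All prices, volumes, and sample counts are hypothetical assumptions chosen only to show how the arithmetic scales, not measured or quoted figures.

```python
# Hypothetical back-of-the-envelope comparison of consumable costs per library prep batch.
# Every number below is an assumption for illustration, not a vendor or published figure.

SAMPLES = 1536               # e.g., four 384-well plates prepared in one batch
REAGENT_COST_PER_UL = 1.50   # assumed cost of library prep chemistry ($/µL)
TIP_COST = 0.05              # assumed cost per disposable pipette tip ($)


def batch_cost(reaction_ul, tips_per_sample):
    """Total reagent plus tip cost ($) for one batch of SAMPLES libraries."""
    per_sample = reaction_ul * REAGENT_COST_PER_UL + tips_per_sample * TIP_COST
    return SAMPLES * per_sample


full_volume = batch_cost(reaction_ul=50.0, tips_per_sample=8)   # conventional full-volume prep
miniaturized = batch_cost(reaction_ul=5.0, tips_per_sample=1)   # miniaturized, low-tip automation

print(f"full volume:  ${full_volume:,.0f}")
print(f"miniaturized: ${miniaturized:,.0f}")
print(f"tips avoided per batch: {SAMPLES * (8 - 1)}")
```

Under these assumptions, the miniaturized workflow consumes roughly a tenth of the reagent cost and avoids thousands of tips per batch, the same logic that underlies the plastics concern above.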
Within the general life sciences market, we continued to see discussions about sustainability.
Applications pushing the boundaries
The availability of high-throughput sequencing has fundamentally enabled new fields of study that were previously considered impossible––single-cell and spatial biology are notable, quickly growing examples. These fields, in turn, are now demanding even greater sequencing capacity, driving the market for faster and more efficient library preparation.
Several core applications in applied biotechnology rely on high-throughput sequencing for essential quality control (QC) and development:
- Synthetic biology (SynBio): This field depends heavily on synthesizing and verifying novel DNA sequences for cell and gene engineering applications.
- Cell and gene therapy: The expanding field of gene and genome editing has spurred rapid growth of NGS-based QC assays that help ensure the reliable use of these technologies in human therapeutics development.
- QC for biotherapeutics: Development of biologics such as antibodies or mRNA-based biotherapeutics often involves sequencing plasmid vectors and amplicons to screen and verify clone sequences.
Historically, these quality control and screening applications were conducted using Sanger sequencing but are now migrating to short-read NGS or long-read technologies like ONT or PacBio sequencing, necessitating library workflows that can handle the large number of required samples.
Scalability needs innovation
The throughput of next-generation sequencing is no longer constrained by the central instrument, but by factors often considered to be on the periphery—the demanding, costly, and complex library preparation workflow. To keep pace with the needs of modern biology, gene therapy, and expanding commercial sectors like pharmaceuticals and AgBio, the scientific community must commit to simplifying existing chemistries, embracing automation technologies like acoustic dispensing, and tackling the escalating economic and environmental burden of plastic consumption.
By focusing innovation on making library preparation simpler, cheaper, and more automated, the limitations currently holding back the analysis of thousands of samples per run can be overcome, fully realizing the potential of NGS in modern genetics and biotechnology.
Joe Mellor, PhD, is the CSO and co-founder of seqWell.
