BioPerspectives - GEN - Genetic Engineering and Biotechnology News
https://www.genengnews.com/category/topics/bioperspectives/
Leading the way in life science technologies
Thu, 27 Jul 2023 23:34:24 +0000

Forge Biologics and OBiO Technology Sign Separate Gene Therapy Deals
https://www.genengnews.com/topics/omics/forge-biologics-and-obio-technology-sign-separate-gene-therapy-deals/
Thu, 29 Jun 2023 12:00:41 +0000

A strategic agreement combines the capacity and capabilities of Shanghai-based OBiO process development scientists with enabling platforms and support from Univercells Technologies, according to officials at both companies. Separately, the New Hope Research Foundation and Forge Biologics signed a development and cGMP manufacturing partnership agreement to advance the Foundation’s novel gene therapy, NHR01, into Phase I/II clinical trials for patients with GM2 gangliosidosis.

The post Forge Biologics and OBiO Technology Sign Separate Gene Therapy Deals appeared first on GEN - Genetic Engineering and Biotechnology News.

OBiO Technology, a gene and cell therapy CDMO in China, and Univercells Technologies entered into a strategic agreement to deploy novel technologies in Shanghai. The NevoLine Upstream biomanufacturing platform with scale-X bioreactors will now be offered in OBiO’s recently expanded (+828,000 ft2) GMP Shanghai facility to accelerate gene therapy manufacturing.

The strategic agreement combines the capacity and capabilities of OBiO process development scientists with enabling platforms and support from Univercells Technologies, according to officials at both companies, who add that the combined know-how and capabilities will put OBiO in the position to offer GMP manufacturing services from R&D to commercial stages to gene therapy developers.

“Bringing gene therapies to market safely requires reliable, scalable manufacturing technologies. We were intrigued by the potential and became early adopters of the scale-X technology in 2020. Our recently published poster details the successful scale-up of an oncolytic Herpes Simplex Virus Type-1 (HSV-1) vector from scale-X hydro (2.4 m²) to scale-X carbo bioreactor (30 m²), with significant reductions in manpower, materials and space requirements,” said Jia Guodong, CEO, OBiO.

“We will pursue this comprehensive collaboration with Univercells Technologies across multiple products, conducting process development in the advanced fixed-bed bioreactor not only for HSV, but also for lentiviral vectors, adeno-associated virus, and other cell and gene therapy modalities.”

“As we enter this agreement, the teams are already planning to scale up the oncolytic HSV-1 process in the 600 m² scale-X nitro integrated in the continuous NevoLine Upstream platform,” added Florence Vicaire, CCO, Univercells Technologies. “We look forward to further increasing OBiO’s capabilities to meet the growing demand for gene therapies at a reduced cost, in China and globally.”

cGMP manufacturing partnership

Separately, the New Hope Research Foundation, a nonprofit organization dedicated to finding a genetic cure for GM2 gangliosidosis (including Tay-Sachs) and other lysosomal storage diseases, and Forge Biologics signed a development and cGMP manufacturing partnership agreement to advance the Foundation’s novel gene therapy, NHR01, into Phase I/II clinical trials for patients with GM2 gangliosidosis.

According to Timothy J. Miller, PhD, CEO, president, and co-founder of Forge Biologics, “Forge will provide adeno-associated virus (AAV) process development, analytical services, and cGMP manufacturing. The Foundation will leverage Forge’s platform processes, including its proprietary HEK293 suspension Ignition Cells™, to accelerate the initial production. All development and AAV manufacturing activities will occur at the Hearth, Forge’s 200,000 ft2 gene therapy facility in Columbus, OH.”

“We look forward to embarking on our manufacturing collaboration with their experienced team and tried-and-true platform process to help accelerate our therapy into clinical trials and deliver new hope for patients with GM2 gangliosidosis,” said Jack Keimel, co-founder and president of the New Hope Research Foundation.

Samsung Biologics Signs Biosimilar Manufacturing Deal with Pfizer
https://www.genengnews.com/news/samsung-biologics-signs-biosimilar-manufacturing-deal-with-pfizer/
Fri, 09 Jun 2023 12:00:43 +0000

Under the terms of the new agreement, Samsung Biologics will provide Pfizer with additional capacity for large-scale manufacturing for a multi-product biosimilars portfolio covering oncology, inflammation, and immunology. Samsung will use its newest facility, Plant 4, which was completed earlier this month, for the manufacturing of products. Samsung Biologics and Pfizer entered into an initial manufacturing agreement in March 2023 for a Pfizer product.

The post Samsung Biologics Signs Biosimilar Manufacturing Deal with Pfizer appeared first on GEN - Genetic Engineering and Biotechnology News.

Samsung Biologics formed a strategic partnership for the long-term commercial manufacturing of Pfizer’s multi-product portfolio. Samsung Biologics and Pfizer entered into an initial manufacturing agreement in March 2023 for a Pfizer product.

 

Under the terms of the new agreement, Samsung Biologics will provide Pfizer with additional capacity for large-scale manufacturing for a multi-product biosimilars portfolio covering oncology, inflammation, and immunology. Samsung will use its newest facility, Plant 4, for the manufacturing of products. 

 

“We are pleased to extend the strategic collaboration with Pfizer as we share and support their strong vision to bring innovative solutions for patients around the globe,” said John Rim, president and CEO of Samsung Biologics. “This new meaningful partnership comes just as our Plant 4 is fully completed early this month as we had previously committed and are on the move for future expansion into our second campus in order to provide our clients with even more flexible and advanced manufacturing technology.” 

 

“Pfizer is excited to continue our strategic partnership with Samsung Biologics that aims to enable greater access to medicines for more patients across the world,” said Mike McDermott, chief global supply officer, executive vp, Pfizer. “This commitment is a reflection of Pfizer’s trust in the Korean pharmaceutical industry to address emerging health challenges.” 

 

Applying Complex Proteome Profile Methods to Large Sample Cohorts
https://www.genengnews.com/topics/bioperspectives/applying-complex-proteome-profile-methods-to-large-sample-cohorts-2/
Mon, 23 Jul 2018 12:05:00 +0000

Two videos from Thermo Fisher Scientific on translational proteomics.

The post Applying Complex Proteome Profile Methods to Large Sample Cohorts appeared first on GEN - Genetic Engineering and Biotechnology News.


<Sponsored Content>


Large Scale Proteomics

Recent advances in mass spectrometry have dramatically expanded the capabilities of translational proteomics, advancing our understanding of health and disease. Translational proteomics complements other omics disciplines (genomics, transcriptomics, and metabolomics/lipidomics), delivering new workflows that produce clinically relevant results that are quantitative, reproducible, standardized, and scalable. Thermo Scientific Orbitrap LC-MS workflows lead the way to accelerate your journey from discovery research to clinical applications, and support your success every step of the way.

Applying Complex Proteome Profile Methods to Large Sample Cohorts
Dr. Andreas Huhmer, Director of Marketing for Proteomics and Metabolomics at Thermo Fisher Scientific, discusses two analytical proteomics workflows that have been developed to deliver reproducibility and standardization across the clinical research field.

Applying Precursor Level Quantitation to Large-Scale Clinical Proteomics Research
Dr. Jun Qu, Professor of Pharmaceutical Sciences at the State University of New York at Buffalo, discusses precursor-level quantitation and how fewer missing values enable precise and sensitive quantitative results.

System Qualification, Inter-Sample, and Intra-Sample Quality Control
https://www.genengnews.com/topics/bioperspectives/system-qualification-inter-sample-and-intra-sample-quality-control-2/
Mon, 16 Jul 2018 12:00:00 +0000

A new focus has been placed on designing translational studies so that the resulting concepts can be transferred across labs and projects.

The post System Qualification, Inter-Sample, and Intra-Sample Quality Control appeared first on GEN - Genetic Engineering and Biotechnology News.


An Essential Approach for Translational Proteomics

<Sponsored Content>

 

We are in a new age of clinical research, where experimental design is moving from cohort sizes of 2×2 and 10×10 for putative biomarker panel identification to hundreds or even thousands of samples. A new focus has been placed on designing translational studies so that the resulting concepts can be transferred across labs and projects. The time and cost associated with large-scale studies mandate high-quality data acquisition for every project.

Scientific Challenge

Translational and clinical research studies profile individuals against a cohort to mine for putative biomarkers. Study sizes have grown tremendously over the last five years to include hundreds or even thousands of samples. These larger sample numbers create significant challenges in assessing the original experimental hypothesis, due to:

• Study duration, over which data acquisition may be interrupted.
• The amount of data generated, which becomes challenging to process and interpret.
• Experiment set-up and method development, particularly within a consortium, where transferability becomes critical for reproducibility.

Adding to these challenges, current analytical methods create a gap from bench studies to clinical application and from small- to large-scale experiments. Quality control (QC) standards are either not used correctly or are tailored to specific applications and thus cannot be shared. When reproducibility is not assessed at each step of the analysis, methods cannot be verified by other labs, making it impossible to bridge the translational gap; as a result, bench studies often fail to reach the clinic.


Figure: Approach to proteomic biomarker studies

Solution for Translational Proteomics

The use of an optimized, systematic, and standardized approach to proteomics experiments permits the direct comparison of results across experiments, projects, and laboratories.

Identifying and evaluating each step within a workflow allows QC steps to be included along the way. Incorporating standardized QC into sample collection and storage, sample preparation, system suitability testing, and sample analysis ensures data integrity as well as reproducible analyses. Commercially available, externally validated standards support this uniformity across workflows.

Once QC methods are in place, system suitability and performance can be measured both within the study and post-study to determine systematic and experimental variance, facilitating more accurate quantification of biological variance. Successful QC methods can then be qualified and implemented in subsequent translational studies of biological systems.
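As a concrete illustration of the kind of inter-sample QC this enables, the sketch below flags peptides whose intensity drifts across repeated injections of a QC pool. It is a minimal, hypothetical example: the peptide names, intensities, and the 20% CV acceptance threshold are assumptions for illustration, not values taken from this article.

```python
import statistics

def qc_cv_percent(intensities):
    """Coefficient of variation (%) of a QC peptide's intensity across runs."""
    mean = statistics.mean(intensities)
    return 100.0 * statistics.stdev(intensities) / mean

def flag_drifting_peptides(qc_runs, threshold=20.0):
    """Return peptides whose across-run CV% exceeds the acceptance threshold.

    qc_runs: dict mapping peptide -> list of intensities, one per QC injection.
    """
    return {pep: cv for pep, ints in qc_runs.items()
            if (cv := qc_cv_percent(ints)) > threshold}

# Hypothetical QC-pool intensities measured across four injections
qc = {"PEPTIDEA": [1.00e6, 1.05e6, 0.98e6, 1.02e6],
      "PEPTIDEB": [2.0e5, 3.1e5, 1.4e5, 2.6e5]}
print(flag_drifting_peptides(qc))  # PEPTIDEB's CV is well above 20%
```

In practice the threshold and the set of monitored peptides would come from the qualified QC method itself, so the same acceptance criteria can travel between labs along with the workflow.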

Future Plans

Harmonization of QC methods across studies has already been observed in new targeted metabolomics studies. Systems such as the Biocrates Absolute IDQ p180, which include calibration standards, QC standards, integrated software, and clear QC metrics, provide reliable, QC-tested commercial standards that can be applied to any analytical method in metabolomics studies.

Steps toward reproducible quantitative analyses within and across labs, such as implementing common QC samples, analytical standards such as SIL peptides, and biological QC standards such as reference pools of serum/plasma from NIST and Golden West, are improving experimental results and providing a solid foundation for further analysis, ensuring money is not wasted reinventing the wheel. Further assistance from industry partners such as Thermo Fisher Scientific with workflow standardization and QC reporting makes it easier for individual labs to begin discovery experiments at a level of quality that ensures they remain meaningful long-term.

Pushing the Limits of Bottom-Up Proteomics
https://www.genengnews.com/topics/bioperspectives/pushing-the-limits-of-bottom-up-proteomics-2/
Mon, 09 Jul 2018 11:40:00 +0000

Understanding the dynamics of the proteome requires analyzing it across different conditions and time points throughout the cellular life cycle.

The post Pushing the Limits of Bottom-Up Proteomics appeared first on GEN - Genetic Engineering and Biotechnology News.


Daniel Lopez-Ferrer, PhD, Thermo Fisher Scientific
Michael Blank, PhD, Product Marketing Manager, Thermo Fisher Scientific
Stephan Meding, PhD, Thermo Fisher Scientific
Aran Paulus, PhD, Thermo Fisher Scientific
Romain Huguet, PhD, Thermo Fisher Scientific
Remco Swart, PhD, Director Product Manager, Thermo Fisher Scientific
Andreas FR Huhmer, PhD, Thermo Fisher Scientific

Using State-Of-The-Art Capillary UHPLC and Orbitrap Mass Spectrometry for Reproducible Quantitation of Proteomes

<Sponsored Content>

Since its inception, bottom-up proteomics has aimed to identify and quantify the complete proteome of a cell, tissue, or whole organism. Although many advances have been made in the last 15 years, three main challenges remain. The first is obtaining complete coverage of the proteome by identifying all the expressed proteins at a given time. The second is working with limited sample amounts, such as clinical biopsies, and the third is achieving sufficient analytical throughput. Understanding the dynamics of the proteome requires analyzing it across different conditions and time points throughout the cellular life cycle. For many studies, this analysis needs to be performed in a high-throughput manner. To further complicate matters, discriminating the most important proteins constituting a given cellular state requires accurate peptide measurement across several orders of magnitude.

The remainder of this article is available as a Thermo Fisher application note, downloadable as a PDF.

The Translational Proteomics Workflows Driving Biomarker Research beyond Discovery
https://www.genengnews.com/topics/bioperspectives/the-translational-proteomics-workflows-driving-biomarker-research-beyond-discovery-2/
Mon, 02 Jul 2018 13:00:00 +0000

The latest translational proteomics workflows and LC-MS technologies are helping to overcome the bottlenecks in the biomarker pipeline.

The post The Translational Proteomics Workflows Driving Biomarker Research beyond Discovery appeared first on GEN - Genetic Engineering and Biotechnology News.


Emily I. Chen, PhD, Senior Director, Thermo Fisher Scientific

<Sponsored Content>

 

The field of proteomics has advanced considerably as technologies and workflows have steadily matured. Advances in liquid chromatography-mass spectrometry (LC-MS), now capable of high-throughput analysis and deep proteomic coverage, have increased the number of potential protein biomarkers identified in discovery studies. However, a significant translational gap exists between research activity and clinical practice: while the number of candidate biomarkers continues to rise, the number of FDA-approved diagnostic markers remains relatively low.1 Fortunately, the latest translational proteomics workflows and LC-MS technologies are helping to overcome the bottlenecks in the biomarker pipeline.

Label-Free DDA Biomarker Verification

LC-MS workflows are used across the discovery, verification and validation stages of the biomarker pipeline. But while success criteria for discovery efforts often tend towards proteome coverage and measurement sensitivity, these requirements evolve as the analytical focus shifts from identification to robust quantitation. As sample numbers grow to support the larger studies used to assess statistical significance and clinical utility, factors such as throughput, reproducibility, and scalability become increasingly important.

Label-free data-dependent acquisition (DDA) methods are commonplace for larger discovery and targeted verification stage workflows. These workflows, based on tandem MS analysis of the most abundant precursor ions, seek to minimize redundant peptide precursor selection and maximize proteome coverage. However, despite the widespread adoption of these workflows, the precision and reproducibility of conventional DDA methods have been problematic. For large-scale proteomics studies requiring easily standardized and transferable quantitative methods, this presents a significant challenge.
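The precursor-selection logic behind DDA can be sketched in a few lines. This is a simplified, hypothetical illustration; real instrument firmware also applies charge-state filters, exclusion time windows, and m/z tolerances, none of which are modeled here.

```python
def select_precursors(survey_scan, excluded, top_n=10):
    """Top-N DDA: pick the most intense MS1 precursors for fragmentation,
    skipping any m/z already on the dynamic-exclusion list."""
    candidates = [(mz, inten) for mz, inten in survey_scan if mz not in excluded]
    candidates.sort(key=lambda peak: peak[1], reverse=True)
    picked = [mz for mz, _ in candidates[:top_n]]
    excluded.update(picked)  # exclude these for subsequent duty cycles
    return picked

# One MS1 survey scan: (m/z, intensity) pairs
scan = [(500.3, 1.0e6), (622.1, 8.0e5), (433.9, 5.0e5), (710.4, 2.0e5)]
already_fragmented = {500.3}
print(select_precursors(scan, already_fragmented, top_n=2))  # [622.1, 433.9]
```

Because which precursors get picked depends on which happened to be most intense in that run, two injections of the same sample can fragment different peptides, which is exactly the run-to-run irreproducibility the text describes.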

Ongoing advances in LC-MS technologies, such as more precise capillary flow techniques and novel column technologies, are helping to achieve more consistent results. These design improvements are resulting in superior analytical sensitivity and significantly minimized mobile phase dead volumes, leading to more stable peak areas and enhanced measurement reproducibility.

In addition to improving sample separation, advances in high-resolution accurate mass are also increasing the run-to-run reproducibility of biomarker verification workflows. The precision, sensitivity and mass accuracy of the latest generation of Orbitrap™ mass spectrometers are enabling the delivery of more comprehensive peptide coverage and better quantitative data. In turn, these advances are enabling greater inter-run consistency, facilitating easier transfer of standardized methods between instruments. Combining next-generation chromatographic and MS performance, these ‘DDA-plus’ workflows are increasing the quantitative power of analytical runs.

High-Resolution DIA Targeted Quantitation

Offering improved sensitivity over DDA methods, label-free data-independent acquisition (DIA) workflows are gaining traction as an alternative approach for biomarker verification. DIA workflows are based on the analysis of all peptide fragments isolated within consecutive isolation windows. As each MS/MS spectrum records the fragment ions from all co-eluting peptides within the predefined m/z precursor window, DIA methods offer significant multiplexing capacity and proteome-wide quantitation.
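A DIA method is defined largely by how the precursor m/z range is tiled into these consecutive isolation windows. The sketch below generates a fixed-width scheme; the 400–1000 m/z range, 25 Th width, and 1 Th overlap are illustrative assumptions, not parameters taken from this article.

```python
def dia_windows(mz_start=400.0, mz_end=1000.0, width=25.0, overlap=1.0):
    """Tile [mz_start, mz_end] with consecutive precursor isolation windows.
    Adjacent windows overlap slightly so peptides near an edge are not lost."""
    windows, lo = [], mz_start
    while True:
        hi = min(lo + width, mz_end)
        windows.append((lo, hi))
        if hi >= mz_end:
            break
        lo = hi - overlap  # step back by the overlap before the next window
    return windows

scheme = dia_windows()
print(len(scheme), scheme[0], scheme[-1])  # 25 windows covering 400-1000 m/z
```

Narrowing `width` is the trade-off the next paragraphs discuss: smaller windows mean fewer co-isolated peptides per MS/MS spectrum, at the cost of more windows per duty cycle.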

Despite the potential of DIA workflows, issues around selectivity and dynamic range have limited their use. The wide isolation windows employed in conventional DIA experiments collect data on multiple co-isolated precursors, resulting in highly complex spectra. Quantification in complex matrices such as clinical plasma samples is, therefore, challenging due to the peptide diversity and naturally broad dynamic range of plasma proteins.

High-resolution DIA workflows based on hybrid quadrupole-Orbitrap MS technologies are addressing the twin challenges of selectivity and dynamic range and increasing the quality of large-scale proteomics data. The exceptional resolution offered by these instruments means that much narrower acquisition windows can be used, allowing for improved precursor selectivity, quantitative reproducibility, and precision.

PRM Biomarker Validation

At the validation stages of the biomarker pipeline, sensitive and specific protein quantification is required. Conventional MS approaches for biomarker validation have generally been based on selected reaction monitoring (SRM) methods utilizing triple quadrupole instruments. However, due to the need to select the most intense product ions, the development of methods capable of sensitive protein quantitation can be complex.2

Parallel reaction monitoring (PRM) is an alternative approach for biomarker validation that is underpinned by hybrid quadrupole-Orbitrap technologies. PRM workflows offer outstanding levels of selectivity, sensitivity, and throughput, but generally require less extensive assay development than SRM approaches. Moreover, the exceptional resolution offered by Orbitrap analyzers also provides higher specificity, enabling the confident detection of low-abundance peptide targets in complex biological matrices.

Conventional PRM methods have been limited by the retention time variation of peptide targets caused by fluctuations in temperature, solvent and column conditions between runs. Although these effects can be minimized using wider retention time windows, improving measurement consistency often means compromising on the number of targets that can be monitored simultaneously.

Direct retention time PRM (dRT-PRM) is a new generation of PRM workflow designed to overcome this challenge. By enabling real-time monitoring and adjustment of retention time windows, and by using internal calibration peptides to produce easily recognized standard signals, dRT-PRM workflows allow the retention time to be recalibrated during analysis, offering a more reproducible method for targeted protein quantitation.
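The retention-time recalibration at the heart of this idea can be approximated with a simple linear fit: calibration peptides with known scheduled retention times are located in the current run, and the fitted mapping re-centers every target's acquisition window. This is a hypothetical sketch of the concept only; the calibrant RTs and window half-width are invented, and it is not Thermo's actual dRT-PRM implementation.

```python
def fit_rt_map(expected, observed):
    """Least-squares linear fit observed ~ a * expected + b from
    calibration peptides' scheduled vs. measured retention times."""
    n = len(expected)
    mx, my = sum(expected) / n, sum(observed) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(expected, observed))
         / sum((x - mx) ** 2 for x in expected))
    return a, my - a * mx

def recalibrated_window(target_rt, half_width, a, b):
    """Re-center a target's RT acquisition window using the fitted map."""
    center = a * target_rt + b
    return (center - half_width, center + half_width)

# Calibration peptides eluted ~0.4-0.7 min late in this hypothetical run
a, b = fit_rt_map([10.0, 20.0, 30.0], [10.5, 20.6, 30.7])
print(recalibrated_window(25.0, 1.0, a, b))  # window shifted later, ~24.65-26.65 min
```

Because the windows track the drift instead of being widened to absorb it, they can stay narrow, which is what preserves the number of targets that can be scheduled simultaneously.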

Conclusion

The latest generation of high-throughput LC-MS workflows is supporting proteomics research at the discovery stage and beyond. By addressing the evolving needs of the translational pipeline, these workflows are helping to accelerate biomarker development and opening up new opportunities for precision medicine.

References
1. CE Parker and CH Borchers, Advances in mass spectrometry-based clinical biomarker discovery, Mol. Oncol., 2014, DOI: 10.1016/j.molonc.2014.03.006
2. GE Ronsein et al., Parallel reaction monitoring (PRM) and selected reaction monitoring (SRM) exhibit comparable linearity, dynamic range, and precision for targeted quantitative HDL proteomics, J. Proteomics, 2015, DOI: 10.1016/j.jprot.2014.10.017

 

Emily I. Chen, Ph.D., is Senior Director of the Thermo Fisher Precision Medicine Science Center and has over ten years of experience in mass spectrometry-based shotgun proteomics, oncology, and translational research. As the director of Herbert Irving Comprehensive Cancer Center Proteomics Shared Resource at Columbia University Medical Center, she led proteomics biomarker discovery efforts and interacted directly with physicians to support precision medicine projects.

The post The Translational Proteomics Workflows Driving Biomarker Research beyond Discovery appeared first on GEN - Genetic Engineering and Biotechnology News.

]]>
The Translational Proteomics Workflows Driving Biomarker Research beyond Discovery https://www.genengnews.com/resources/the-translational-proteomics-workflows-driving-biomarker-research-beyond-discovery/ Mon, 02 Jul 2018 13:00:00 +0000 The latest translational proteomics workflows and LC-MS technologies are helping to overcome the bottlenecks in the biomarker pipeline.

The post The Translational Proteomics Workflows Driving Biomarker Research beyond Discovery appeared first on GEN - Genetic Engineering and Biotechnology News.

Emily I. Chen, Ph.D., Senior Director, Thermo Fisher

<Sponsored Content>

 

The field of proteomics has advanced considerably as technologies and workflows have steadily matured. Advances in liquid chromatography-mass spectrometry (LC-MS), now capable of high-throughput analysis and deep proteomic coverage, have increased the number of potential protein biomarkers identified in discovery studies. However, a significant translational gap exists between research activity and clinical practice: while the number of candidate biomarkers continues to rise, the number of FDA-approved diagnostic markers remains relatively low.1 Fortunately, the latest translational proteomics workflows and LC-MS technologies are helping to overcome the bottlenecks in the biomarker pipeline.

Label-Free DDA Biomarker Verification

LC-MS workflows are used across the discovery, verification and validation stages of the biomarker pipeline. But while success criteria for discovery efforts often tend towards proteome coverage and measurement sensitivity, these requirements evolve as the analytical focus shifts from identification to robust quantitation. As sample numbers grow to support the larger studies used to assess statistical significance and clinical utility, factors such as throughput, reproducibility, and scalability become increasingly important.

Label-free data-dependent acquisition (DDA) methods are commonplace for larger discovery and targeted verification stage workflows. These workflows, based on tandem MS analysis of the most abundant precursor ions, seek to minimize redundant peptide precursor selection and maximize proteome coverage. However, despite the widespread adoption of these workflows, the precision and reproducibility of conventional DDA methods have been problematic. For large-scale proteomics studies requiring easily standardized and transferable quantitative methods, this presents a significant challenge.
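The precursor-selection logic at the heart of DDA can be sketched in a few lines. The function below is purely illustrative (names, thresholds, and the simple exclusion set are hypothetical, not any vendor's method editor): it picks the top-N most intense precursors from a survey scan while skipping ions on a dynamic-exclusion list.

```python
# Illustrative sketch of top-N DDA precursor selection (hypothetical names
# and thresholds, not an actual instrument method).

def select_precursors(survey_scan, top_n=10, excluded=frozenset(),
                      min_intensity=1e4):
    """Pick the top-N most intense precursor m/z values for MS/MS,
    skipping ions on the dynamic-exclusion list."""
    candidates = [
        (mz, intensity) for mz, intensity in survey_scan
        if intensity >= min_intensity and mz not in excluded
    ]
    # Rank by intensity, highest first, and take at most top_n.
    candidates.sort(key=lambda peak: peak[1], reverse=True)
    return [mz for mz, _ in candidates[:top_n]]

# Survey scan as (m/z, intensity) pairs; 445.12 is dynamically excluded
# and 622.02 falls below the intensity threshold.
scan = [(445.12, 2e6), (512.30, 8e5), (622.02, 5e3), (733.41, 3e5)]
print(select_precursors(scan, top_n=2, excluded={445.12}))
```

Because selection depends on which ions happen to be most abundant in a given run, two injections of the same sample can fragment different precursors, which is exactly the run-to-run reproducibility problem described above.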

Ongoing advances in LC-MS technologies, such as more precise capillary flow techniques and novel column technologies, are helping to achieve more consistent results. These design improvements are resulting in superior analytical sensitivity and significantly minimized mobile phase dead volumes, leading to more stable peak areas and enhanced measurement reproducibility.

In addition to improving sample separation, advances in high-resolution accurate-mass (HRAM) detection are also increasing the run-to-run reproducibility of biomarker verification workflows. The precision, sensitivity, and mass accuracy of the latest generation of Orbitrap™ mass spectrometers are delivering more comprehensive peptide coverage and better quantitative data. In turn, these advances enable greater inter-run consistency, facilitating easier transfer of standardized methods between instruments. Combining next-generation chromatographic and MS performance, these ‘DDA-plus’ workflows are increasing the quantitative power of analytical runs.

High-Resolution DIA Targeted Quantitation

Offering improved sensitivity over DDA methods, label-free data-independent acquisition (DIA) workflows are gaining traction as an alternative approach for biomarker verification. DIA workflows are based on the analysis of all peptide fragments isolated within consecutive isolation windows. As each MS/MS spectrum records the fragment ions from all co-eluting peptides within the predefined m/z precursor window, DIA methods offer significant multiplexing capacity and proteome-wide quantitation.
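The consecutive isolation windows that define a DIA cycle can be pictured as a simple tiling of the precursor m/z range. The sketch below is a minimal illustration of that idea, assuming a fixed window width; the range and width are placeholder values, not instrument defaults.

```python
# Sketch of how consecutive DIA isolation windows tile a precursor m/z
# range (fixed-width windows; values are illustrative only).

def dia_windows(mz_start=400.0, mz_end=1000.0, width=25.0):
    """Return (lower, upper) m/z bounds of consecutive isolation windows."""
    windows = []
    lower = mz_start
    while lower < mz_end:
        upper = min(lower + width, mz_end)
        windows.append((lower, upper))
        lower = upper
    return windows

print(dia_windows(400.0, 500.0, 25.0))  # four contiguous 25 m/z windows
```

Every peptide whose precursor falls inside a window is co-fragmented, which is why narrower windows (made practical by higher-resolution analyzers, as discussed below) simplify the resulting spectra.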

Despite the potential of DIA workflows, issues around selectivity and dynamic range have limited their use. The wide isolation windows employed in conventional DIA experiments collect data on multiple co-isolated precursors, resulting in highly complex spectra. Quantification in complex matrices such as clinical plasma samples is therefore challenging, given the peptide diversity and naturally broad dynamic range of plasma proteins.

High-resolution DIA workflows based on hybrid quadrupole-Orbitrap MS technologies are addressing the twin challenges of selectivity and dynamic range and increasing the quality of large-scale proteomics data. The exceptional resolution offered by these instruments means that much narrower acquisition windows can be used, allowing for improved precursor selectivity, quantitative reproducibility, and precision.

PRM Biomarker Validation

At the validation stages of the biomarker pipeline, sensitive and specific protein quantification is required. Conventional MS approaches for biomarker validation have generally been based on selected reaction monitoring (SRM) methods utilizing triple quadrupole instruments. However, due to the need to select the most intense product ions, the development of methods capable of sensitive protein quantitation can be complex.2

Parallel reaction monitoring (PRM) is an alternative approach for biomarker validation that is underpinned by hybrid quadrupole-Orbitrap technologies. PRM workflows offer outstanding levels of selectivity, sensitivity, and throughput, yet generally require less extensive assay development than SRM approaches. Moreover, the exceptional resolution offered by Orbitrap analyzers also provides higher specificity, enabling the confident detection of low-abundance peptide targets in complex biological matrices.

Conventional PRM methods have been limited by the retention time variation of peptide targets caused by fluctuations in temperature, solvent and column conditions between runs. Although these effects can be minimized using wider retention time windows, improving measurement consistency often means compromising on the number of targets that can be monitored simultaneously.

Direct retention time PRM (dRT-PRM) is a new generation of PRM workflow designed to overcome this challenge. By monitoring easily recognized signals from internal calibration peptides in real time, dRT-PRM workflows recalibrate retention time windows during the analysis itself, offering a more reproducible method for targeted protein quantitation.
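The recalibration idea can be illustrated with a toy model: fit the observed elution times of the calibration peptides against their reference times, then use that fit to re-centre each target's scheduled window. This is a hedged sketch of the concept only (a simple least-squares line with made-up retention times), not the vendor's actual dRT-PRM implementation.

```python
# Toy illustration of retention-time recalibration: fit observed vs.
# reference RTs of internal calibration peptides, then shift scheduled
# acquisition windows accordingly. All values are hypothetical.

def fit_rt_drift(reference_rts, observed_rts):
    """Least-squares line: observed = slope * reference + offset."""
    n = len(reference_rts)
    mean_x = sum(reference_rts) / n
    mean_y = sum(observed_rts) / n
    sxx = sum((x - mean_x) ** 2 for x in reference_rts)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(reference_rts, observed_rts))
    slope = sxy / sxx
    offset = mean_y - slope * mean_x
    return slope, offset

def recalibrated_window(target_rt, slope, offset, half_width=1.0):
    """Re-centre a target's scheduled window on the corrected RT."""
    centre = slope * target_rt + offset
    return (centre - half_width, centre + half_width)

# Calibrant peptides eluting ~0.5 min later than reference in this run:
slope, offset = fit_rt_drift([10.0, 20.0, 30.0], [10.5, 20.5, 30.5])
print(recalibrated_window(25.0, slope, offset))  # window centred on 25.5
```

Because the windows track the observed drift, they can stay narrow without risking missed targets, which is how dRT-PRM avoids the window-width trade-off described above.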

Conclusion

The latest generation of high-throughput LC-MS workflows is supporting proteomics research at the discovery stage and beyond. By addressing the evolving needs of the translational pipeline, these workflows are helping to accelerate biomarker development and opening up new opportunities for precision medicine.

References
1. CE Parker and CH Borchers, Advances in mass spectrometry-based clinical biomarker discovery, Mol. Oncol., 2014, DOI: 10.1016/j.molonc.2014.03.006
2. GE Ronsein et al., Parallel reaction monitoring (PRM) and selected reaction monitoring (SRM) exhibit comparable linearity, dynamic range, and precision for targeted quantitative HDL proteomics, J. Proteomics, 2015, DOI: 10.1016/j.jprot.2014.10.017

 

Emily I. Chen, Ph.D., is Senior Director of the Thermo Fisher Precision Medicine Science Center and has over ten years of experience in mass spectrometry-based shotgun proteomics, oncology, and translational research. As the director of Herbert Irving Comprehensive Cancer Center Proteomics Shared Resource at Columbia University Medical Center, she led proteomics biomarker discovery efforts and interacted directly with physicians to support precision medicine projects.
