
Complexity analysis of algorithms: A case study on bioinformatics tools

Journal: World Journal of Biology and Biotechnology (Vol.6, No. 3)

Publication Date:

Authors :

Page : 11-13

Keywords : Time complexity; Computational tools; Empirical analysis.


Abstract

The volume of data produced by the omics sciences today is driven by the adoption of next-generation sequencing (NGS) platforms. Analyses performed on these data include mapping, genome assembly, genome annotation, pangenomic analysis, quality control, and redundancy removal, among others. Regarding redundancy removal, several tools perform this task with accuracy demonstrated in their scientific publications, but those publications lack an assessment of algorithmic complexity. This work therefore performs an empirical analysis of the algorithmic complexity of computational tools that remove redundancy from the raw reads produced by DNA sequencing. The analysis was performed on sixteen raw-read datasets. The datasets were processed with the following tools: MarDRe, NGSReadsTreatment, ParDRe, FastUniq, and BioSeqZip, and the measurements were analyzed on the R statistical platform using the GuessCompx package. The results show that BioSeqZip and ParDRe exhibit the lowest complexity in this analysis.
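The empirical approach described in the abstract can be illustrated in outline: run the step under study on inputs of increasing size, record the elapsed time, and fit candidate complexity classes to the measurements, keeping the class with the lowest fitting error. The Python sketch below is a minimal, hypothetical illustration of this idea only; it does not reproduce the GuessCompx package or the authors' pipeline, and the deduplication routine and dataset sizes are placeholders.

# Minimal sketch of empirical complexity estimation (hypothetical, not the
# GuessCompx package): time a routine on growing inputs and pick the
# best-fitting complexity class by least-squares error.
import time
import math
import random

def deduplicate(reads):
    """Placeholder for a read-deduplication step (set-based removal)."""
    return list(set(reads))

def measure(func, sizes, alphabet="ACGT", read_len=100):
    """Return (size, seconds) pairs for synthetic read sets of each size."""
    timings = []
    for n in sizes:
        reads = ["".join(random.choices(alphabet, k=read_len)) for _ in range(n)]
        start = time.perf_counter()
        func(reads)
        timings.append((n, time.perf_counter() - start))
    return timings

def best_fit(timings):
    """Fit t ~ c * g(n) for each candidate g and return the lowest-error class."""
    models = {
        "O(1)": lambda n: 1.0,
        "O(n)": lambda n: float(n),
        "O(n log n)": lambda n: n * math.log(n),
        "O(n^2)": lambda n: float(n * n),
    }
    scores = {}
    for name, g in models.items():
        xs = [g(n) for n, _ in timings]
        ts = [t for _, t in timings]
        c = sum(x * t for x, t in zip(xs, ts)) / sum(x * x for x in xs)
        scores[name] = sum((t - c * x) ** 2 for x, t in zip(xs, ts))
    return min(scores, key=scores.get)

if __name__ == "__main__":
    sizes = [1000, 2000, 4000, 8000, 16000]
    print(best_fit(measure(deduplicate, sizes)))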
