
Progressive Detection of Duplicate Data

Journal: International Journal of Science and Research (IJSR) (Vol.6, No. 1)

Publication Date:

Authors :

Page : 1647-1649

Keywords : Duplicate detection; entity resolution; progressiveness; data cleaning

Source : Download | Find it from : Google Scholar

Abstract

Duplicate detection is the process of identifying multiple representations of the same real-world entity. Today, duplicate detection methods must process ever larger datasets in shorter time while maintaining dataset quality, which makes identifying duplicated entities increasingly difficult. This application focuses on duplicates in hierarchical data such as XML files. The datasets are loaded into the application, and processing, extraction, cleaning, separation, and detection are carried out to remove the duplicated data. Comprehensive experiments show that our progressive algorithms can double the efficiency over time of traditional duplicate detection and significantly improve upon related work.
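To make the idea of progressiveness concrete, here is a minimal sketch of progressive duplicate detection using a progressive sorted-neighborhood strategy. This is an assumption for illustration only; the abstract does not specify the paper's actual algorithm, similarity measure, or parameters. Records are sorted by a key, and pairs are compared in order of growing sort distance, so the most promising comparisons (and thus most duplicates) are reported early.

```python
# Hypothetical sketch of progressive duplicate detection via a
# progressive sorted-neighborhood method. All names and parameters
# (max_rank, threshold, the similarity measure) are illustrative
# assumptions, not taken from the paper.

from difflib import SequenceMatcher


def similarity(a, b):
    """String similarity in [0, 1] between two records."""
    return SequenceMatcher(None, a, b).ratio()


def progressive_duplicates(records, max_rank=3, threshold=0.8):
    """Yield likely duplicate pairs, most promising comparisons first.

    Instead of one fixed comparison window (classic sorted
    neighborhood), the comparison distance grows progressively: all
    pairs at sort distance 1 are compared before any pair at
    distance 2, and so on, so results arrive early and improve over
    time.
    """
    # Sort record indices by the record text (a simple sort key).
    order = sorted(range(len(records)), key=lambda i: records[i])
    for rank in range(1, max_rank + 1):  # progressive window growth
        for pos in range(len(order) - rank):
            i, j = order[pos], order[pos + rank]
            if similarity(records[i], records[j]) >= threshold:
                yield (i, j)


# Usage example with a few near-duplicate name records.
records = ["John Smith", "Jon Smith", "Jane Doe", "John  Smith", "J. Doe"]
dupes = list(progressive_duplicates(records))
```

In this sketch the close variants of "John Smith" are reported in the earliest passes, which mirrors the progressive goal of delivering most duplicates long before a full pairwise comparison would finish.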

Last modified: 2021-06-30 17:35:27