Exploration of Data Mining Techniques in Record Deduplication
Journal: International Journal of Science and Research (IJSR), Vol. 2, No. 11
Publication Date: 2013-11-05
Authors: R. Gayathri; A. Malathi
Pages: 216-219
Keywords: record deduplication; preprocessing; cleaning; dirty data; genetic programming; Modified BAT algorithm; firefly algorithm
Abstract
In today's business world, databases play a vital role in decision making. As an organization grows, the size of its database increases as well, and this enormous growth leads to the problem of dirty data. Dirty data is replicated data in the database, which causes issues such as performance degradation, increased operational cost, and poor data quality. It can be removed through record deduplication: the process of identifying records that represent the same entity under different representations. Cleaning the repository and removing replicas then becomes mandatory work. This paper surveys record deduplication approaches and compares three of them, namely genetic programming, the Modified BAT algorithm, and the firefly algorithm, discussing the advantages and limitations of each.
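To make the core task concrete, the following is a minimal sketch of record deduplication by pairwise field similarity. It is illustrative only and is not one of the surveyed approaches (genetic programming, Modified BAT, or firefly); the sample records, the field concatenation, and the 0.85 threshold are all assumptions.

```python
# Minimal record-deduplication sketch: flag record pairs whose
# concatenated fields are sufficiently similar as strings.
# (Illustrative assumptions: sample data and the 0.85 threshold.)
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Normalized similarity between two strings, in [0.0, 1.0]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def find_duplicates(records, threshold=0.85):
    """Return index pairs (i, j) of records judged to be duplicates."""
    keys = [" ".join(fields) for fields in records]
    pairs = []
    for i in range(len(keys)):
        for j in range(i + 1, len(keys)):
            if similarity(keys[i], keys[j]) >= threshold:
                pairs.append((i, j))
    return pairs


records = [
    ("John Smith", "12 Main St"),
    ("Jon Smith", "12 Main Street"),  # same entity, different representation
    ("Alice Brown", "99 Oak Ave"),
]
print(find_duplicates(records))  # the first two records match
```

The all-pairs comparison is quadratic in the number of records; the surveyed evolutionary approaches aim instead to learn good combinations of such similarity functions rather than rely on a single fixed threshold.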