
TEXT COMPRESSION ALGORITHMS - A COMPARATIVE STUDY

Journal: ICTACT Journal on Communication Technology (IJCT) (Vol.2, No. 4)

Publication Date:

Authors :

Page : 444-451

Keywords : Encoding; Decoding; Lossless Compression; Dictionary Methods

Source : Download | Find it from : Google Scholar

Abstract

Data compression may be defined as the science and art of representing information in a compact form. For decades, data compression has been one of the critical enabling technologies of the ongoing digital multimedia revolution. Many data compression algorithms are available to compress files of different formats. This paper provides a survey of the basic lossless data compression algorithms. Experimental comparisons of lossless compression algorithms using statistical compression techniques and dictionary-based compression techniques were performed on text data. Among the statistical coding techniques, Shannon-Fano coding, Huffman coding, Adaptive Huffman coding, Run Length Encoding and Arithmetic coding are considered. The Lempel-Ziv scheme, a dictionary-based technique, is divided into two families: one derived from LZ77 (LZ77, LZSS, LZH, LZB and LZR) and the other derived from LZ78 (LZ78, LZW, LZFG, LZC and LZT). A set of interesting conclusions is derived on this basis.
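To make the two families of techniques surveyed in the abstract concrete, here is a minimal sketch of static Huffman coding, one of the statistical methods listed above. It is not taken from the paper; the function names and the sample string are illustrative assumptions, and it produces a prefix-code table from symbol frequencies and encodes the input as a bit string.

```python
# Minimal static Huffman coding sketch (illustrative, not the paper's code).
import heapq
from collections import Counter

def build_huffman_codes(text):
    """Build a prefix-code table mapping each symbol to a bit string."""
    freq = Counter(text)
    # Heap entries are (frequency, tie-breaker, tree); a tree is either a
    # single symbol (leaf) or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:                       # degenerate case: one distinct symbol
        return {heap[0][2]: "0"}
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):      # internal node: descend into subtrees
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                            # leaf: record the accumulated code
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

if __name__ == "__main__":
    sample = "this is an example of huffman coding"
    codes = build_huffman_codes(sample)
    bits = encode(sample, codes)
    print(f"original: {len(sample) * 8} bits, compressed: {len(bits)} bits")
```

For the dictionary-based family, the following is a similarly hedged sketch of the LZW encoding step (a member of the LZ78 family mentioned above): it grows a phrase dictionary as it scans the input and emits dictionary indices instead of raw characters.

```python
def lzw_encode(text):
    """Return a list of dictionary indices for the input string (LZW sketch)."""
    # Start with single-character entries for all byte values.
    dictionary = {chr(i): i for i in range(256)}
    next_code = 256
    phrase = ""
    output = []
    for ch in text:
        candidate = phrase + ch
        if candidate in dictionary:
            phrase = candidate                 # keep extending the current phrase
        else:
            output.append(dictionary[phrase])  # emit the longest known phrase
            dictionary[candidate] = next_code  # add the new phrase to the dictionary
            next_code += 1
            phrase = ch
    if phrase:
        output.append(dictionary[phrase])
    return output
```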
