
DE-DUPLICATION WITH ENHANCED RELIABILITY AND SECURE AUDITING IN CLOUD

Journal: International Journal of Engineering Sciences & Research Technology (IJESRT) (Vol.5, No. 3)

Publication Date:

Authors :

Pages : 598-602

Keywords : Reliability; Secure De-duplication; Integrity Auditing; Distributed File System; Cloud Storage

Source : Download; Find it from : Google Scholar

Abstract

As cloud computing has developed over the last decade, outsourcing data to cloud services for storage has become an attractive trend, since it saves the effort of heavy data maintenance and management. However, because outsourced cloud storage is not fully trustworthy, it raises the security question of how to achieve data de-duplication in the cloud while also achieving integrity auditing. In this work, we study the problem of integrity auditing and secure de-duplication of cloud data. Specifically, aiming to achieve both data integrity and de-duplication in the cloud, we propose two secure systems, SecCloud and SecCloud+. SecCloud introduces an auditing entity that maintains a MapReduce cloud, which helps clients generate data tags before uploading and audits the integrity of data already stored in the cloud. Compared with previous work, the computation required of the client in SecCloud is greatly reduced during the file-uploading and auditing phases. SecCloud+ is motivated by the fact that users always want to encrypt their data before uploading, and it enables integrity auditing and secure de-duplication directly on encrypted data.

De-duplication reduces energy consumption and Internet bandwidth because it removes large amounts of redundant data that would otherwise be transferred and stored. By eliminating duplicate data, we conserve Internet bandwidth, reduce the cost of storage, improve the quality of searching, and lighten the heavy load on the remote server.

A hash value is computed for each uploaded file and checked to determine whether redundant content is already present among the files stored in the cloud. There are two types of de-duplication. The first is file-level: the hash values of two files are compared, and if they match, the files have the same content and only one copy is stored in the cloud. The second is block-level: the file is divided into multiple blocks, and the hash value of each block is checked for de-duplication.
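To make the file-level check concrete, the following minimal Python sketch (an illustration, not the paper's implementation) keeps an in-memory hash index and stores a file only when its SHA-256 digest has not been seen before:

    import hashlib

    store = {}  # hash -> file bytes (the single stored copy)

    def upload_file(data: bytes) -> str:
        """Store the file only if its hash is not already present."""
        digest = hashlib.sha256(data).hexdigest()
        if digest not in store:      # new content: keep one copy
            store[digest] = data
        return digest                # reference the client keeps

    # Two uploads of identical content occupy storage only once.
    ref1 = upload_file(b"quarterly report")
    ref2 = upload_file(b"quarterly report")
    assert ref1 == ref2 and len(store) == 1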
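Block-level de-duplication can be sketched the same way; the fixed 4 KB block size and the in-memory block index below are illustrative assumptions, since the abstract does not specify them:

    import hashlib

    BLOCK_SIZE = 4096    # assumed fixed block size
    block_store = {}     # block hash -> block bytes

    def upload_blocks(data: bytes) -> list[str]:
        """Split the file into blocks and store only unseen blocks."""
        refs = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            if digest not in block_store:  # duplicate blocks are skipped
                block_store[digest] = block
            refs.append(digest)            # recipe to rebuild the file
        return refs

    def download(refs: list[str]) -> bytes:
        """Reassemble the file from its block references."""
        return b"".join(block_store[r] for r in refs)

    # 10 000 bytes -> 3 blocks, only 2 unique (the tail block is shorter).
    data = b"A" * 10000
    refs = upload_blocks(data)
    assert download(refs) == data and len(block_store) == 2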
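The abstract does not describe how SecCloud+ reconciles encryption with de-duplication. One standard technique for this setting (not necessarily the one used in the paper) is convergent encryption, where the key is derived from the plaintext itself, so identical files encrypt to identical ciphertexts that the cloud can de-duplicate without ever seeing the plaintext. A minimal sketch using AES-GCM from the third-party 'cryptography' package:

    import hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def convergent_encrypt(data: bytes) -> tuple[bytes, bytes]:
        key = hashlib.sha256(data).digest()  # key = H(plaintext)
        # Fixed nonce is tolerable here only because each derived key
        # encrypts exactly one message (the plaintext it came from).
        ciphertext = AESGCM(key).encrypt(b"\x00" * 12, data, None)
        return key, ciphertext

    key1, ct1 = convergent_encrypt(b"same file")
    key2, ct2 = convergent_encrypt(b"same file")
    assert ct1 == ct2  # equal plaintexts -> equal ciphertexts, so dedup works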

Last modified: 2016-03-27 18:17:30