
Implementation of Image Processing System using Handover Technique with Map Reduce Based on Big Data in the Cloud Environment

Journal: The International Arab Journal of Information Technology (Vol.13, No. 2)

Publication Date:

Authors :

Page : 326-331

Keywords : Cloud computing; big data; HDFS; MapReduce; DHRF algorithm


Abstract

Cloud computing is one of the emerging techniques for processing big data and is also known as service on demand. Big data refers to very large sets or volumes of data, and processing big data such as MRI and DICOM images normally takes considerable time. Demanding tasks such as handling big data can be addressed using Hadoop, and enhancing Hadoop helps the user to process large sets of images. The Hadoop Distributed File System (HDFS) and MapReduce are the two core components used for this purpose: HDFS is the Hadoop file system used for storing and retrieving data, while MapReduce combines two functions, map and reduce. Map splits the input into independent parts, and reduce integrates the outputs produced by the map phase. Medical experts have recently experienced problems such as machine failure and poor fault tolerance while processing results for scanned data. To address this, a unique optimized time-scheduling algorithm, the Dynamic Handover Reduce Function (DHRF) algorithm, is introduced in the reduce function. Enhancing Hadoop in the cloud and introducing DHRF help overcome these processing risks, yielding optimized results with less waiting time and a lower error percentage in the output image.
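For readers unfamiliar with the map/reduce pattern the abstract describes, the sketch below shows a minimal Hadoop MapReduce job in Java that groups scanned-image records by modality and counts them. It is a generic illustration of the split-and-integrate pattern only, not the paper's DHRF algorithm; the tab-separated input format (imageId, modality) and the class names are assumptions made for this example.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Illustrative only: counts image records per modality (e.g., MRI, CT)
// from HDFS input lines of the assumed form "imageId<TAB>modality".
public class ModalityCount {

    // Map phase: split each input line and emit (modality, 1).
    public static class ModalityMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text modality = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length == 2) {
                modality.set(fields[1].trim());
                context.write(modality, ONE);
            }
        }
    }

    // Reduce phase: integrate the map outputs by summing counts per modality.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "modality count");
        job.setJarByClass(ModalityCount.class);
        job.setMapperClass(ModalityMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The paper's DHRF algorithm modifies scheduling inside the reduce stage; in a standard job like the one above, reduce scheduling is left entirely to the Hadoop framework.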
