BE/BTech & ME/MTech Final Year Projects for Computer Science | Information Technology | ECE Engineering | IEEE Project Topics, PhD Project Reports, Ideas and Downloads | Sai Info Solution | Nashik | Pune | Mumbai


SAI INFO SOLUTION


Diploma | BE | B.Tech | ME | M.Tech | PhD

Project Development and Training



An Efficient Data Duplication System based on Hadoop Distributed File System


Abstract


HDFS (Hadoop Distributed File System) is the storage component of Apache Hadoop, designed to store large data sets reliably. It supports parallel processing of massive-scale data and ensures availability by replicating data blocks across different nodes. However, the replication policy of HDFS does not take the popularity of the data into account: the popularity of files tends to change over time, so maintaining a fixed replication factor hurts the storage efficiency of HDFS. An Efficient Data Duplication System based on HDFS is therefore proposed, which considers the popularity of the files stored in HDFS before replicating them. The proposed technique reduces storage consumption by up to 45% without affecting the availability and fault tolerance of HDFS.
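
The core idea, adjusting a file's replication factor to match its observed popularity, can be expressed directly against Hadoop's Java API. The sketch below is illustrative only: the class name AdaptiveReplicator, the hot/warm/cold thresholds, and the source of the access counts are assumptions rather than the system's published design; FileSystem.setReplication() is the standard HDFS call for changing a file's replication factor.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.io.IOException;

/**
 * Minimal sketch of popularity-aware replication on HDFS.
 * Thresholds and the access-count source are assumptions,
 * not the paper's actual design.
 */
public class AdaptiveReplicator {

    private static final short MIN_REPLICAS = 1; // cold files keep one replica
    private static final short MAX_REPLICAS = 3; // HDFS default, used for hot files

    private final FileSystem fs;

    public AdaptiveReplicator(Configuration conf) throws IOException {
        this.fs = FileSystem.get(conf);
    }

    /** Map an observed access count to a target replication factor. */
    private short replicasFor(long accessCount) {
        if (accessCount > 100) return MAX_REPLICAS; // hot
        if (accessCount > 10)  return 2;            // warm
        return MIN_REPLICAS;                        // cold
    }

    /** Re-replicate one file according to its current popularity. */
    public void adjust(Path file, long accessCount) throws IOException {
        short target = replicasFor(accessCount);
        short current = fs.getFileStatus(file).getReplication();
        if (target != current) {
            // setReplication only updates metadata; the NameNode
            // schedules the block copies/deletions asynchronously.
            fs.setReplication(file, target);
        }
    }

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration(); // reads core-site.xml / hdfs-site.xml from the classpath
        AdaptiveReplicator r = new AdaptiveReplicator(conf);
        r.adjust(new Path("/data/input/part-00000"), 42L); // hypothetical file and count
    }
}

The same adjustment can also be made from the command line with the standard hdfs dfs -setrep command, e.g. hdfs dfs -setrep -w 2 /data/input/part-00000.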

Keywords
Data Locality, Data Duplication, Hadoop, Access Prediction.



Call us: 09096813348 / 02536644344
Mail ID: developer.saiinfo@gmail.com
Skype ID: saiinfosolutionnashik