Medical application of privacy preservation by big data analysis using the Hadoop MapReduce framework

Ashish P, Tejas S, Srinivasa J G and Sumeet G

MapReduce has become the de-facto framework for large-scale data analysis and
data mining. The computer industry is challenged to develop methods and techniques for
affordable data processing on large datasets at optimum response times, and the technical
challenges of handling ever-growing volumes of data are daunting and on the rise. One
recent processing model that offers a more efficient and intuitive way to rapidly process
large amounts of data in parallel is MapReduce. It is a framework that defines a programming
template for large-scale data computation on clusters of machines in a cloud computing
environment. MapReduce provides automatic parallelization and distribution of computation
across many processors, and it hides the complexity of writing parallel and distributed
programming code.