A Hadoop Based Framework for Secure Medical Datasets

  • Language: English
  • ISBN: 9798224231140
  • Binding: Paperback
  • Pages: 124
  • Published: 10 February 2024
  • Dimensions: 216 × 8 × 280 mm
  • Weight: 334 g

Description of A Hadoop Based Framework for Secure Medical Datasets

Developments in the medical field have led to the production of massive amounts of medical data. As early as 2002, the Department of Radiology of a hospital in Geneva was producing more than 12,000 images a day. These medical datasets are available for further exploration and research, with far-reaching impact on the design and execution of health programs. The information gained from exploring medical datasets paves the way for health administration, e-health diagnosis and therapy. There is therefore an urgent need to intensify research on medical data.

Medical data is a large and growing industry, and dataset sizes now routinely reach terabytes. Such big data raises many challenges and issues because of its volume, variety, velocity, value and variability. Moreover, traditional file management systems are slowing down under it, as they cannot handle unstructured, variable and complex big data. Managing such big data is a cumbersome and time-consuming task that requires new computing techniques. The exponential growth of medical data has therefore necessitated a paradigm shift in how the data is managed and processed. Recent technological advances have changed how big data is stored and processed, and this motivated us to look for new solutions for managing voluminous medical datasets and extracting valuable information from them efficiently.

Hadoop is a top-level Apache project written in Java. It was developed by Doug Cutting as a collection of open-source projects and is now widely used on massive amounts of unstructured data. With Hadoop, data that was previously difficult to analyze can be harnessed, and extremely large datasets with changing structure can be processed. Hadoop is composed of different modules such as HBase, Pig, HCatalog, Hive, ZooKeeper, Oozie and Kafka, but the most common paradigms for big data are the Hadoop Distributed File System (HDFS) and MapReduce.
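The description names HDFS and MapReduce without showing what the programming model looks like in practice. The following is a minimal illustrative sketch, not drawn from the book itself: the canonical Hadoop word-count job in Java, whose map phase emits (token, 1) pairs and whose reduce phase sums them per token. On a medical dataset stored in HDFS, the same pattern could, for instance, count occurrences of diagnosis codes; the input and output paths are placeholders.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    // Canonical MapReduce word count: the mapper emits (token, 1) for every
    // token in its input split; the reducer sums the counts for each token.
    public class WordCount {

      public static class TokenizerMapper
          extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);  // emit (token, 1)
          }
        }
      }

      public static class IntSumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();          // sum partial counts for this token
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);  // local pre-aggregation
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Input and output paths typically refer to HDFS directories.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

Compiled into a jar, such a job would typically be submitted with hadoop jar wordcount.jar WordCount <input> <output>, with both paths resolving to HDFS.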
