Papers by Swapnali Salunkhe


Initially, Hadoop was designed without performance and security aspects in mind; it was simply used to process big data in parallel. Today, however, users need big data processed at high speed and with security features. Hadoop has limitations while executing jobs that reduce its efficiency and increase job execution time, mostly because of the job-processing method of the current Hadoop system, its scheduling, and its resource allocation. The proposed system replaces the current job-processing method using an OAuth token and a real-time encryption algorithm. The proposed system matches a new job against previously executed jobs; if a match is found, the stored results are returned to the user, and if not, the new job is executed. When the matching rate is high, execution time automatically decreases. The proposed system also focuses on the security constraints of the current Hadoop system, which secures data only while it is being uploaded to and downloaded from the system. Dat...
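The job-matching idea above can be sketched as a result cache keyed by a job signature: an incoming job whose signature matches a previously executed one is answered from the cache instead of being re-executed. This is a minimal single-process sketch of the concept, not the paper's implementation; all names (`JobResultCache`, the signature scheme) are hypothetical.

```python
# Hypothetical sketch of job matching: cache results of executed jobs
# under a signature; a matching new job reuses the stored result.
import hashlib

class JobResultCache:
    def __init__(self):
        self._cache = {}  # job signature -> stored result

    def _signature(self, job_script: str, input_path: str) -> str:
        # An identical script run on identical input yields the same signature.
        key = (job_script + "\n" + input_path).encode("utf-8")
        return hashlib.sha256(key).hexdigest()

    def run(self, job_script: str, input_path: str, execute):
        sig = self._signature(job_script, input_path)
        if sig in self._cache:
            # Match found: display the same result without re-executing.
            return self._cache[sig], True
        # No match: execute the new job and remember its result.
        result = execute(job_script, input_path)
        self._cache[sig] = result
        return result, False
```

A second submission of the same script and input returns the cached result, which is how a high matching rate would lower average execution time.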

International Journal of Science and Research (IJSR), Jul 5, 2017
The Hadoop framework is used to process big data in parallel. Big data is not only large in size; it also comes in different formats, in different sizes, and at different speeds, so a relational database management system is not suitable for processing it. Hadoop is the most popular framework for processing big data. The Hadoop framework architecture has several components, such as the NameNode, DataNode, JobTracker, and TaskTracker, and Hadoop's performance depends on how these components execute. The challenge in the Hadoop framework is to reduce job processing time, which depends on various factors such as scheduling, resource allocation, data encryption, and the performance of MapReduce after data encryption. The proposed research focuses on how to overcome the challenges of scheduling, resource allocation, and security. Hadoop data security is also a proposed research area, i.e., finding the most suitable encryption algorithm that encrypts Hadoop data without affecting Hadoop performance.
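The parallel processing model the abstract refers to is MapReduce: a map phase emits (key, value) pairs per input split, and a reduce phase aggregates values per key. The following is a minimal single-process Python sketch of that programming model (a word count), not the Hadoop API itself; in Hadoop, the splits would live on DataNodes and the phases would run as distributed tasks.

```python
# Single-process sketch of the MapReduce model: map each input split
# to (word, 1) pairs, then reduce by summing counts per word.
from collections import defaultdict
from itertools import chain

def map_phase(split: str):
    # Emit a (word, 1) pair for every word in one input split.
    return [(word.lower(), 1) for word in split.split()]

def reduce_phase(pairs):
    # Group pairs by key and sum their counts, as shuffle + reduce would.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two splits stand in for blocks stored on separate DataNodes.
splits = ["big data is big", "data moves fast"]
mapped = chain.from_iterable(map_phase(s) for s in splits)
word_counts = reduce_phase(mapped)  # e.g. word_counts["big"] == 2
```

Because each split is mapped independently, the map phase parallelizes naturally across nodes, which is the property the framework exploits.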