{"id":13933,"date":"2013-06-11T19:00:27","date_gmt":"2013-06-11T16:00:27","guid":{"rendered":"http:\/\/www.javacodegeeks.com\/?p=13933"},"modified":"2013-06-11T22:55:04","modified_gmt":"2013-06-11T19:55:04","slug":"setting-up-apache-hadoop-multi-node-cluster","status":"publish","type":"post","link":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html","title":{"rendered":"Setting up Apache Hadoop Multi &#8211; Node Cluster"},"content":{"rendered":"<p>We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). Here we will also share our experience about different troubleshooting also and make update in future.<\/p>\n<h2>User creation and other configurations step\u00a0&#8211;<\/h2>\n<ul>\n<li>We start by adding a dedicated Hadoop system user in each cluster.<\/li>\n<\/ul>\n<p>&nbsp;<br \/>\n&nbsp;<br \/>\n&nbsp;<\/p>\n<pre class=\" brush:bash\">$ sudo addgroup hadoop\r\n$ sudo adduser \u2013ingroup hadoop hduser<\/pre>\n<ul>\n<li>Next we configure the SSH (Secure Shell) on all the cluster to enable secure data communication.<\/li>\n<\/ul>\n<pre class=\" brush:bash\">user@node1:~$ su \u2013 hduser\r\nhduser@node1:~$ ssh-keygen -t rsa -P \u201c\u201d<\/pre>\n<p>The output will be something like the following:<\/p>\n<pre class=\" brush:bash\">Generating public\/private rsa key pair.\r\nEnter file in which to save the key (\/home\/hduser\/.ssh\/id_rsa):\r\nCreated directory '\/home\/hduser\/.ssh'.\r\nYour identification has been saved in \/home\/hduser\/.ssh\/id_rsa.\r\nYour public key has been saved in \/home\/hduser\/.ssh\/id_rsa.pub.\r\nThe key fingerprint is:\r\n9b:82:ea:58:b4:e0:35:d7:ff:19:66:a6:ef:ae:0e:d2 hduser@ubuntu\r\n.....<\/pre>\n<ul>\n<li>Next we need to enable SSH access to local machine with this newly created key:<\/li>\n<\/ul>\n<pre class=\" brush:bash\">hduser@node1:~$ cat $HOME\/.ssh\/id_rsa.pub &gt;&gt; $HOME\/.ssh\/authorized_keys<\/pre>\n<p>Repeat the above steps in 
all the cluster nodes and test by executing the following command:<\/p>\n<pre class=\" brush:bash\">hduser@node1:~$ ssh localhost<\/pre>\n<p>This step is also needed to save the local machine\u2019s host key fingerprint to the hduser user\u2019s known_hosts file.<\/p>\n<p>Next we need to edit the \/etc\/hosts file, in which we put the IP address and name of each system in the cluster.<\/p>\n<p>In our scenario we have one master (with IP 192.168.0.100) and one slave (with IP 192.168.0.101).<\/p>\n<pre class=\" brush:bash\">$ sudo vi \/etc\/hosts<\/pre>\n<p>and we add the entries to the hosts file as IP\/name pairs:<\/p>\n<pre class=\" brush:bash\"> \r\n192.168.0.100 master\r\n192.168.0.101 slave\r\n<\/pre>\n<ul>\n<li>Providing the SSH Access<\/li>\n<\/ul>\n<p>The hduser user on the master node must be able to connect<\/p>\n<ol>\n<li>to its own user account on the master via <i>ssh master<\/i> (in this context, not necessarily <i>ssh localhost<\/i>);<\/li>\n<li>to the hduser account of the slave(s) via a password-less SSH login.<\/li>\n<\/ol>\n<p>So we distribute the SSH public key of hduser@master to all its slaves. (In our case we have only one slave; if you have more, execute the following command for each one, changing the machine name, i.e. 
slave, slave1, slave2).<\/p>\n<pre class=\" brush:bash\">hduser@master:~$ ssh-copy-id -i $HOME\/.ssh\/id_rsa.pub hduser@slave<\/pre>\n<p>Try connecting from master to master and from master to the slave(s), and check that everything works.<\/p>\n<h2>Configuring Hadoop<\/h2>\n<ul>\n<li>Let us edit <b>conf\/masters <\/b>(only on the master node)<\/li>\n<\/ul>\n<p>and enter <b>master <\/b>into the file.<\/p>\n<p>Doing this we have told Hadoop to start the\u00a0<i>secondary NameNode<\/i> of our multi-node cluster on this machine.<\/p>\n<p>The <i>primary NameNode <\/i>and the <i>JobTracker <\/i>will always be on the machine where we run <b>bin\/start-dfs.sh <\/b>and <b>bin\/start-mapred.sh<\/b>.<\/p>\n<ul>\n<li>Let us now edit <b>conf\/slaves<\/b> (only on the master node) with<\/li>\n<\/ul>\n<pre class=\" brush:bash\">master\r\nslave<\/pre>\n<p>This means that we also run a datanode process on the master machine \u2013 the machine where the namenode is running. If we have more machines at our disposal as datanodes, we can leave <strong>master\u00a0<\/strong>out of the slave role.<\/p>\n<p>If we have more slaves, we add one host per line, like the following:<\/p>\n<pre class=\" brush:bash\">master\r\nslave\r\nslave2\r\nslave3<\/pre>\n<p>and so on.<\/p>\n<p>Let us now edit two important files (<b>in all the nodes <\/b>of our cluster):<\/p>\n<ol>\n<li><strong>conf\/core-site.xml<\/strong><\/li>\n<li><strong>conf\/hdfs-site.xml<\/strong><\/li>\n<\/ol>\n<p><strong>1) conf\/core-site.xml<\/strong><\/p>\n<p>We have to change the <i>fs.default.name <\/i>parameter, which specifies the NameNode host and port. 
(In our case this is the master machine.)<\/p>\n<pre class=\"brush:xml\">&lt;property&gt;\r\n\r\n&lt;name&gt;fs.default.name&lt;\/name&gt;\r\n&lt;value&gt;hdfs:\/\/master:54310&lt;\/value&gt;\r\n\r\n\u2026..[Other XML Values]\r\n\r\n&lt;\/property&gt;<\/pre>\n<p>Create a directory into which Hadoop will store its data &#8211;<\/p>\n<pre class=\"brush:bash\">$ mkdir \/app\/hadoop<\/pre>\n<p>We have to ensure the directory is writable by any user:<\/p>\n<pre class=\"brush:bash\">$ chmod 777 \/app\/hadoop<\/pre>\n<p>Modify core-site.xml once again to add the following property:<\/p>\n<pre class=\"brush:xml\">&lt;property&gt;\r\n&lt;name&gt;hadoop.tmp.dir&lt;\/name&gt;\r\n&lt;value&gt;\/app\/hadoop&lt;\/value&gt;\r\n&lt;\/property&gt;<\/pre>\n<p><strong>2) conf\/hdfs-site.xml<\/strong><\/p>\n<p>We have to change the <i>dfs.replication<\/i> parameter, which specifies the default block replication. It defines how many machines a single file should be replicated to before it becomes available. If we set this to a value higher than the number of available slave nodes (more precisely, the number of DataNodes), we will start seeing a lot of \u201c(Zero targets found, forbidden1.size=1)\u201d type errors in the log files.<\/p>\n<p>The default value of dfs.replication is 3. 
However, as we have only two nodes available in our scenario, we set dfs.replication to 2.<\/p>\n<pre class=\"brush:xml\">&lt;property&gt;\r\n&lt;name&gt;dfs.replication&lt;\/name&gt;\r\n&lt;value&gt;2&lt;\/value&gt;\r\n\u2026..[Other XML Values]\r\n&lt;\/property&gt;<\/pre>\n<ul>\n<li>Let us\u00a0<b>format<\/b> the HDFS file system via the NameNode.<\/li>\n<\/ul>\n<p>Run the following command on <b>master<\/b>:<\/p>\n<pre class=\"brush:bash\">bin\/hadoop namenode -format<\/pre>\n<ul>\n<li>Let us\u00a0<b>start <\/b>the multi-node cluster:<\/li>\n<\/ul>\n<p>Run the following command (in our case on the machine named master):<\/p>\n<pre class=\"brush:bash\">bin\/start-dfs.sh<\/pre>\n<h2>Checking Hadoop Status\u00a0&#8211;<\/h2>\n<p>After everything has started, run the jps command on all the nodes to check whether everything is running well.<\/p>\n<p>On the master node the desired output is\u00a0&#8211;<\/p>\n<pre class=\"brush:bash\">$ jps\r\n\r\n14799 NameNode\r\n15314 Jps\r\n14880 DataNode\r\n14977 SecondaryNameNode<\/pre>\n<p>On the slave(s):<\/p>\n<pre class=\"brush:bash\">$ jps\r\n15314 Jps\r\n14880 DataNode<\/pre>\n<p>Of course the process IDs will vary from machine to machine.<\/p>\n<h2>Troubleshooting<\/h2>\n<p>The DataNode may fail to start on some of our nodes. If that happens, check<\/p>\n<pre class=\"brush:bash\">logs\/hadoop-hduser-datanode-.log <\/pre>\n<p>on the affected nodes for the exception &#8211;<\/p>\n<pre class=\"brush:bash\">java.io.IOException: Incompatible namespaceIDs<\/pre>\n<p>In this case we need to do the following &#8211;<\/p>\n<ol>\n<li>Stop the full cluster, i.e. both the MapReduce and HDFS layers.<\/li>\n<li>Delete the data directory on the problematic DataNode: the directory is specified by <strong>dfs.data.dir<\/strong> in <strong>conf\/hdfs-site.xml.<\/strong>\u00a0In our case, the relevant directory is\u00a0\/app\/hadoop\/tmp\/dfs\/data<\/li>\n<li>Reformat the NameNode. 
<b>All HDFS data will be lost during the format process.<\/b><\/li>\n<li>Restart the cluster.<\/li>\n<\/ol>\n<p>Alternatively:<\/p>\n<p>We can manually update the namespaceID of problematic DataNodes:<\/p>\n<ol>\n<li>Stop the problematic DataNode(s).<\/li>\n<li>Edit the value of namespaceID in ${dfs.data.dir}\/current\/VERSION to match the corresponding value of the current NameNode in ${dfs.name.dir}\/current\/VERSION.<\/li>\n<li>Restart the fixed DataNode(s).<\/li>\n<\/ol>\n<p>In\u00a0<strong><a title=\"Running Map-Reduce Job in Apache Hadoop (Multinode Cluster)\" href=\"http:\/\/www.javacodegeeks.com\/2013\/06\/running-map-reduce-job-in-apache-hadoop-multinode-cluster.html\" target=\"_blank\">Running Map-Reduce Job in Apache Hadoop (Multinode Cluster)<\/a><\/strong>, we will share our experience of running a MapReduce job, following the Apache Hadoop example.<\/p>\n<h4>Resources<\/h4>\n<ul>\n<li><a href=\"http:\/\/www.michael-noll.com\/tutorials\/running-hadoop-on-ubuntu-linux-single-node-cluster\/\">http:\/\/www.michael-noll.com\/tutorials\/running-hadoop-on-ubuntu-linux-single-node-cluster\/<\/a><\/li>\n<li><a href=\"http:\/\/www.michael-noll.com\/tutorials\/running-hadoop-on-ubuntu-linux-multi-node-cluster\/\">http:\/\/www.michael-noll.com\/tutorials\/running-hadoop-on-ubuntu-linux-multi-node-cluster\/<\/a><\/li>\n<li><a href=\"http:\/\/hadoop.apache.org\/docs\/current\/\">http:\/\/hadoop.apache.org\/docs\/current\/<\/a><\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<div style=\"border: 1px solid #D8D8D8; background: #FAFAFA; width: 100%; padding-left: 5px;\"><b><i>Reference: <\/i><\/b><a href=\"http:\/\/www.phloxblog.in\/setting-hadoop-multi-node-cluster\/\">Setting up Apache Hadoop Multi &#8211; Node Cluster<\/a> from our <a href=\"http:\/\/www.javacodegeeks.com\/jcg\">JCG partner<\/a> Piyas De at the <a href=\"http:\/\/www.phloxblog.in\">Phlox Blog<\/a> blog.<\/div>\n","protected":false},"excerpt":{"rendered":"<p>We are sharing our experience about Apache Hadoop Installation in Linux 
based machines (Multi-node). Here we will also share our experience about different troubleshooting also and make update in future. User creation and other configurations step\u00a0&#8211; We start by adding a dedicated Hadoop system user in each cluster. &nbsp; &nbsp; &nbsp; $ sudo addgroup hadoop &hellip;<\/p>\n","protected":false},"author":448,"featured_media":62,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8],"tags":[184,372],"class_list":["post-13933","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-enterprise-java","tag-apache-hadoop","tag-big-data"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Setting up Apache Hadoop Multi - Node Cluster<\/title>\n<meta name=\"description\" content=\"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). Here we will also share our experience about\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Setting up Apache Hadoop Multi - Node Cluster\" \/>\n<meta property=\"og:description\" content=\"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). 
Here we will also share our experience about\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html\" \/>\n<meta property=\"og:site_name\" content=\"Java Code Geeks\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/javacodegeeks\" \/>\n<meta property=\"article:author\" content=\"http:\/\/www.facebook.com\/phlocblogger\" \/>\n<meta property=\"article:published_time\" content=\"2013-06-11T16:00:27+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2013-06-11T19:55:04+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"150\" \/>\n\t<meta property=\"og:image:height\" content=\"150\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Piyas De\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@https:\/\/twitter.com\/phloxblog\" \/>\n<meta name=\"twitter:site\" content=\"@javacodegeeks\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Piyas De\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"5 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html\"},\"author\":{\"name\":\"Piyas De\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/20f3c9ff4b90d43da03decd2ad2b4f37\"},\"headline\":\"Setting up Apache Hadoop Multi &#8211; Node Cluster\",\"datePublished\":\"2013-06-11T16:00:27+00:00\",\"dateModified\":\"2013-06-11T19:55:04+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html\"},\"wordCount\":879,\"commentCount\":7,\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/apache-hadoop-logo.jpg\",\"keywords\":[\"Apache Hadoop\",\"Big Data\"],\"articleSection\":[\"Enterprise Java\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html\",\"name\":\"Setting up Apache Hadoop Multi - Node 
Cluster\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/apache-hadoop-logo.jpg\",\"datePublished\":\"2013-06-11T16:00:27+00:00\",\"dateModified\":\"2013-06-11T19:55:04+00:00\",\"description\":\"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). Here we will also share our experience about\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/apache-hadoop-logo.jpg\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/apache-hadoop-logo.jpg\",\"width\":150,\"height\":150},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/06\\\/setting-up-apache-hadoop-multi-node-cluster.html#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\
"Enterprise Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\\\/enterprise-java\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"Setting up Apache Hadoop Multi &#8211; Node Cluster\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"name\":\"Java Code Geeks\",\"description\":\"Java Developers Resource Center\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"alternateName\":\"JCG\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.javacodegeeks.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\",\"name\":\"Exelixis Media P.C.\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"width\":864,\"height\":246,\"caption\":\"Exelixis Media P.C.\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/javacodegeeks\",\"https:\\\/\\\/x.com\\\/javacodegeeks\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/20f3c9ff4b90d43da03decd2ad2b4f37\",\"name\":\"Piyas 
De\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g\",\"caption\":\"Piyas De\"},\"description\":\"Piyas is Sun Microsystems certified Enterprise Architect with 10+ years of professional IT experience in various areas such as Architecture Definition, Define Enterprise Application, Client-server\\\/e-business solutions.Currently he is engaged in providing solutions for digital asset management in media companies.He is also founder and main author of \\\"Technical Blogs(Blog about small technical Know hows)\\\" Hyperlink - http:\\\/\\\/www.phloxblog.in\",\"sameAs\":[\"http:\\\/\\\/www.phloxblog.in\",\"http:\\\/\\\/www.facebook.com\\\/phlocblogger\",\"http:\\\/\\\/in.linkedin.com\\\/in\\\/piyasde\",\"https:\\\/\\\/x.com\\\/https:\\\/\\\/twitter.com\\\/phloxblog\"],\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/author\\\/piyas-de\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Setting up Apache Hadoop Multi - Node Cluster","description":"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). 
Here we will also share our experience about","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html","og_locale":"en_US","og_type":"article","og_title":"Setting up Apache Hadoop Multi - Node Cluster","og_description":"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). Here we will also share our experience about","og_url":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html","og_site_name":"Java Code Geeks","article_publisher":"https:\/\/www.facebook.com\/javacodegeeks","article_author":"http:\/\/www.facebook.com\/phlocblogger","article_published_time":"2013-06-11T16:00:27+00:00","article_modified_time":"2013-06-11T19:55:04+00:00","og_image":[{"width":150,"height":150,"url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg","type":"image\/jpeg"}],"author":"Piyas De","twitter_card":"summary_large_image","twitter_creator":"@https:\/\/twitter.com\/phloxblog","twitter_site":"@javacodegeeks","twitter_misc":{"Written by":"Piyas De","Est. 
reading time":"5 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#article","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html"},"author":{"name":"Piyas De","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/20f3c9ff4b90d43da03decd2ad2b4f37"},"headline":"Setting up Apache Hadoop Multi &#8211; Node Cluster","datePublished":"2013-06-11T16:00:27+00:00","dateModified":"2013-06-11T19:55:04+00:00","mainEntityOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html"},"wordCount":879,"commentCount":7,"publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg","keywords":["Apache Hadoop","Big Data"],"articleSection":["Enterprise Java"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html","url":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html","name":"Setting up Apache Hadoop Multi - Node 
Cluster","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg","datePublished":"2013-06-11T16:00:27+00:00","dateModified":"2013-06-11T19:55:04+00:00","description":"We are sharing our experience about Apache Hadoop Installation in Linux based machines (Multi-node). Here we will also share our experience about","breadcrumb":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#primaryimage","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/apache-hadoop-logo.jpg","width":150,"height":150},{"@type":"BreadcrumbList","@id":"https:\/\/www.javacodegeeks.com\/2013\/06\/setting-up-apache-hadoop-multi-node-cluster.html#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.javacodegeeks.com\/"},{"@type":"ListItem","position":2,"name":"Java","item":"https:\/\/www.javacodegeeks.com\/category\/java"},{"@type":"ListItem","position":3,"name":"Enterprise Java","item":"https:\/\/www.javacodegeeks.com\/category\/java\/enterprise-java"},{"@type":"ListItem","position":4,"name":"Setting up Apache Hadoop Multi &#8211; Node 
Cluster"}]},{"@type":"WebSite","@id":"https:\/\/www.javacodegeeks.com\/#website","url":"https:\/\/www.javacodegeeks.com\/","name":"Java Code Geeks","description":"Java Developers Resource Center","publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"alternateName":"JCG","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.javacodegeeks.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.javacodegeeks.com\/#organization","name":"Exelixis Media P.C.","url":"https:\/\/www.javacodegeeks.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","width":864,"height":246,"caption":"Exelixis Media P.C."},"image":{"@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/javacodegeeks","https:\/\/x.com\/javacodegeeks"]},{"@type":"Person","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/20f3c9ff4b90d43da03decd2ad2b4f37","name":"Piyas De","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/eadd6728b7b5be23f0d6585da1a953926e49c6f2369703d6cb4f1147d4dd2203?s=96&d=mm&r=g","caption":"Piyas De"},"description":"Piyas is Sun Microsystems certified Enterprise Architect with 10+ years of professional IT experience in various areas such as Architecture Definition, 
Define Enterprise Application, Client-server\/e-business solutions.Currently he is engaged in providing solutions for digital asset management in media companies.He is also founder and main author of \"Technical Blogs(Blog about small technical Know hows)\" Hyperlink - http:\/\/www.phloxblog.in","sameAs":["http:\/\/www.phloxblog.in","http:\/\/www.facebook.com\/phlocblogger","http:\/\/in.linkedin.com\/in\/piyasde","https:\/\/x.com\/https:\/\/twitter.com\/phloxblog"],"url":"https:\/\/www.javacodegeeks.com\/author\/piyas-de"}]}},"_links":{"self":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/13933","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/users\/448"}],"replies":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/comments?post=13933"}],"version-history":[{"count":0,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/13933\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media\/62"}],"wp:attachment":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media?parent=13933"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/categories?post=13933"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/tags?post=13933"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}