{"id":17079,"date":"2013-09-12T01:00:37","date_gmt":"2013-09-11T22:00:37","guid":{"rendered":"http:\/\/www.javacodegeeks.com\/?p=17079"},"modified":"2013-09-11T09:53:00","modified_gmt":"2013-09-11T06:53:00","slug":"unit-testing-a-java-hadoop-job","status":"publish","type":"post","link":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html","title":{"rendered":"Unit testing a Java Hadoop job"},"content":{"rendered":"<p>In <a href=\"http:\/\/www.javacodegeeks.com\/2013\/08\/writing-a-hadoop-mapreduce-task-in-java.html\">my previous post<\/a> I showed how to set up a complete <a href=\"http:\/\/maven.apache.org\/\">Maven<\/a> based project to create a <a href=\"http:\/\/hadoop.apache.org\/\">Hadoop<\/a> job in Java. Of course it wasn\u2019t complete, because it was missing the unit tests. In this post I show how to add MapReduce unit tests to the project I started previously. For the unit tests I make use of the <a href=\"http:\/\/mrunit.apache.org\/\">MRUnit framework<\/a>.<\/p>\n<ul>\n<ul>\n<li><strong>Add the necessary dependency to the pom<\/strong><\/li>\n<\/ul>\n<\/ul>\n<p>Add the following dependency to the pom:<\/p>\n<pre class=\" brush:xml\">&lt;dependency&gt;\r\n   &lt;groupId&gt;org.apache.mrunit&lt;\/groupId&gt;\r\n   &lt;artifactId&gt;mrunit&lt;\/artifactId&gt;\r\n   &lt;version&gt;1.0.0&lt;\/version&gt;\r\n   &lt;classifier&gt;hadoop1&lt;\/classifier&gt;\r\n   &lt;scope&gt;test&lt;\/scope&gt;\r\n&lt;\/dependency&gt;<\/pre>\n<p>This makes the MRUnit framework available to the project.<\/p>\n<ul>\n<ul>\n<li><strong>Add unit tests for the MapReduce logic<\/strong><\/li>\n<\/ul>\n<\/ul>\n<p>Using this framework is quite straightforward, especially in our business case. I will just show the unit test code, with comments where necessary; I think it is quite obvious how to use it. 
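<\/p>
<p>Before looking at the test, it helps to recall what is being tested. The WordMapper from the previous post does not transform its input: it emits every incoming key-value pair unchanged, which is why the inputs and expected outputs in the test below are identical. Stripped of the Hadoop Mapper API and Text types, its core behaviour can be sketched in plain Java like this (a hypothetical reconstruction, not the original class):<\/p>

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

// Plain-Java sketch of WordMapper's behaviour (hypothetical reconstruction;
// the real class extends Hadoop's Mapper and works with Text objects).
// It is an identity mapping: every key-value pair is emitted as received.
class WordMapperSketch {

    static Map.Entry<String, String> map(String key, String value) {
        return new SimpleEntry<>(key, value);
    }

    public static void main(String[] args) {
        System.out.println(WordMapperSketch.map("a", "ein")); // prints "a=ein"
    }
}
```

<p>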
The unit test for the Mapper \u2018MapperTest\u2019:<\/p>\n<pre class=\" brush:java\">package net.pascalalma.hadoop;\r\n\r\nimport org.apache.hadoop.io.Text;\r\nimport org.apache.hadoop.mrunit.mapreduce.MapDriver;\r\nimport org.junit.Before;\r\nimport org.junit.Test;\r\nimport java.io.IOException;\r\n\r\n\/**\r\n * Created with IntelliJ IDEA.\r\n * User: pascal\r\n *\/\r\npublic class MapperTest {\r\n\r\n    MapDriver&lt;Text, Text, Text, Text&gt; mapDriver;\r\n\r\n    @Before\r\n    public void setUp() {\r\n        WordMapper mapper = new WordMapper();\r\n        mapDriver = MapDriver.newMapDriver(mapper);\r\n    }\r\n\r\n    @Test\r\n    public void testMapper() throws IOException {\r\n        mapDriver.withInput(new Text(\"a\"), new Text(\"ein\"));\r\n        mapDriver.withInput(new Text(\"a\"), new Text(\"zwei\"));\r\n        mapDriver.withInput(new Text(\"c\"), new Text(\"drei\"));\r\n        mapDriver.withOutput(new Text(\"a\"), new Text(\"ein\"));\r\n        mapDriver.withOutput(new Text(\"a\"), new Text(\"zwei\"));\r\n        mapDriver.withOutput(new Text(\"c\"), new Text(\"drei\"));\r\n        mapDriver.runTest();\r\n    }\r\n}<\/pre>\n<p>This test class is actually even simpler than the Mapper implementation itself. You just define the mapper\u2019s input and the expected output, and then let the configured MapDriver run the test. In our case the Mapper doesn\u2019t do anything special, but you can see how easy it is to set up a test case. 
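<\/p>
<p>Note what the reducer test expects: for key a with values ein and zwei, the output value is |ein|zwei, i.e. every value prefixed with a pipe character and concatenated. Stripped of the Hadoop Text types and the Reducer API, the core of AllTranslationsReducer can be sketched in plain Java like this (a hypothetical reconstruction, not the original class):<\/p>

```java
import java.util.Arrays;
import java.util.List;

// Plain-Java sketch of the concatenation AllTranslationsReducer is expected
// to perform (hypothetical reconstruction; the real class works with Hadoop
// Text objects inside a Reducer implementation).
class TranslationsJoiner {

    // Prefix every value with '|' and concatenate, so ["ein", "zwei"]
    // becomes "|ein|zwei", matching the expected output in the test below.
    static String join(List<String> values) {
        StringBuilder result = new StringBuilder();
        for (String value : values) {
            result.append('|').append(value);
        }
        return result.toString();
    }

    public static void main(String[] args) {
        System.out.println(TranslationsJoiner.join(Arrays.asList("ein", "zwei"))); // prints "|ein|zwei"
    }
}
```

<p>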
For completeness here is the test class of the Reducer:<\/p>\n<pre class=\" brush:java\">package net.pascalalma.hadoop;\r\n\r\nimport org.apache.hadoop.io.Text;\r\nimport org.apache.hadoop.mrunit.mapreduce.ReduceDriver;\r\nimport org.junit.Before;\r\nimport org.junit.Test;\r\nimport java.io.IOException;\r\nimport java.util.ArrayList;\r\nimport java.util.List;\r\n\r\n\/**\r\n * Created with IntelliJ IDEA.\r\n * User: pascal\r\n *\/\r\npublic class ReducerTest {\r\n\r\n    ReduceDriver&lt;Text, Text, Text, Text&gt; reduceDriver;\r\n\r\n    @Before\r\n    public void setUp() {\r\n        AllTranslationsReducer reducer = new AllTranslationsReducer();\r\n        reduceDriver = ReduceDriver.newReduceDriver(reducer);\r\n    }\r\n\r\n    @Test\r\n    public void testReducer() throws IOException {\r\n        List&lt;Text&gt; values = new ArrayList&lt;Text&gt;();\r\n        values.add(new Text(\"ein\"));\r\n        values.add(new Text(\"zwei\"));\r\n        reduceDriver.withInput(new Text(\"a\"), values);\r\n        reduceDriver.withOutput(new Text(\"a\"), new Text(\"|ein|zwei\"));\r\n        reduceDriver.runTest();\r\n    }\r\n}<\/pre>\n<ul>\n<ul>\n<li><strong>Run the unit tests<\/strong><\/li>\n<\/ul>\n<\/ul>\n<p>With the Maven command \u201cmvn clean test\u201d we can run the tests:<\/p>\n<p><a href=\"http:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/screen-shot-2013-08-23-at-20-12-50.jpg\"><img decoding=\"async\" class=\"aligncenter size-medium wp-image-17260\" alt=\"screen-shot-2013-08-23-at-20-12-50\" src=\"http:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/screen-shot-2013-08-23-at-20-12-50-300x231.jpg\" width=\"300\" height=\"231\" srcset=\"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/screen-shot-2013-08-23-at-20-12-50-300x231.jpg 300w, 
https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/screen-shot-2013-08-23-at-20-12-50.jpg 925w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\n<p>With the unit tests in place I would say we are ready to build the project and deploy it to a Hadoop cluster, which I will describe in the next post.<\/p>\n<div style=\"border: 1px solid #D8D8D8; background: #FAFAFA; width: 100%; padding-left: 5px;\"><b><i>Reference: <\/i><\/b><a href=\"http:\/\/pragmaticintegrator.wordpress.com\/2013\/08\/26\/unit-testing-a-java-hadoop-job\/\">Unit testing a Java Hadoop job<\/a> from our <a href=\"http:\/\/www.javacodegeeks.com\/jcg\">JCG partner<\/a> Pascal Alma at the <a href=\"http:\/\/pragmaticintegrator.wordpress.com\/\">The Pragmatic Integrator<\/a> blog.<\/div>\n","protected":false},"excerpt":{"rendered":"<p>In my previous post I showed how to set up a complete Maven based project to create a Hadoop job in Java. Of course it wasn\u2019t complete because it was missing the unit tests. In this post I show how to add MapReduce unit tests to the project I started previously. For the unit &hellip;<\/p>\n","protected":false},"author":366,"featured_media":17259,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8],"tags":[184,858,372],"class_list":["post-17079","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-enterprise-java","tag-apache-hadoop","tag-apache-mrunit","tag-big-data"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Unit testing a Java Hadoop job<\/title>\n<meta name=\"description\" content=\"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. 
Of course it wasn\u2019t complete because it is\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Unit testing a Java Hadoop job\" \/>\n<meta property=\"og:description\" content=\"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. Of course it wasn\u2019t complete because it is\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html\" \/>\n<meta property=\"og:site_name\" content=\"Java Code Geeks\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/javacodegeeks\" \/>\n<meta property=\"article:published_time\" content=\"2013-09-11T22:00:37+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"150\" \/>\n\t<meta property=\"og:image:height\" content=\"150\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Pascal Alma\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@https:\/\/twitter.com\/paskal_1973\" \/>\n<meta name=\"twitter:site\" content=\"@javacodegeeks\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Pascal Alma\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"3 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html\"},\"author\":{\"name\":\"Pascal Alma\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/a4c0bb5bfa87eb00be92c7a1d293fecf\"},\"headline\":\"Unit testing a Java Hadoop job\",\"datePublished\":\"2013-09-11T22:00:37+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html\"},\"wordCount\":282,\"commentCount\":2,\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2013\\\/09\\\/apache-mrunit-logo.jpg\",\"keywords\":[\"Apache Hadoop\",\"Apache MRUnit\",\"Big Data\"],\"articleSection\":[\"Enterprise Java\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html\",\"name\":\"Unit testing a Java Hadoop 
job\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2013\\\/09\\\/apache-mrunit-logo.jpg\",\"datePublished\":\"2013-09-11T22:00:37+00:00\",\"description\":\"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. Of course it wasn\u2019t complete because it is\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#primaryimage\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2013\\\/09\\\/apache-mrunit-logo.jpg\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2013\\\/09\\\/apache-mrunit-logo.jpg\",\"width\":150,\"height\":150},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/09\\\/unit-testing-a-java-hadoop-job.html#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Enterprise 
Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\\\/enterprise-java\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"Unit testing a Java Hadoop job\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"name\":\"Java Code Geeks\",\"description\":\"Java Developers Resource Center\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"alternateName\":\"JCG\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.javacodegeeks.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\",\"name\":\"Exelixis Media P.C.\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"width\":864,\"height\":246,\"caption\":\"Exelixis Media P.C.\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/javacodegeeks\",\"https:\\\/\\\/x.com\\\/javacodegeeks\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/a4c0bb5bfa87eb00be92c7a1d293fecf\",\"name\":\"Pascal 
Alma\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g\",\"caption\":\"Pascal Alma\"},\"description\":\"Pascal is a senior JEE Developer and Architect at 4Synergy in The Netherlands. Pascal has been designing and building J2EE applications since 2001. He is particularly interested in Open Source toolstack (Mule, Spring Framework, JBoss) and technologies like Web Services, SOA and Cloud technologies. Specialties: JEE, SOA, Mule ESB, Maven, Cloud Technology, Amazon AWS.\",\"sameAs\":[\"http:\\\/\\\/pragmaticintegrator.wordpress.com\\\/\",\"http:\\\/\\\/www.linkedin.com\\\/in\\\/pascalalma\",\"https:\\\/\\\/x.com\\\/https:\\\/\\\/twitter.com\\\/paskal_1973\"],\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/author\\\/pascal-alma\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Unit testing a Java Hadoop job","description":"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. Of course it wasn\u2019t complete because it is","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html","og_locale":"en_US","og_type":"article","og_title":"Unit testing a Java Hadoop job","og_description":"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. 
Of course it wasn\u2019t complete because it is","og_url":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html","og_site_name":"Java Code Geeks","article_publisher":"https:\/\/www.facebook.com\/javacodegeeks","article_published_time":"2013-09-11T22:00:37+00:00","og_image":[{"width":150,"height":150,"url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg","type":"image\/jpeg"}],"author":"Pascal Alma","twitter_card":"summary_large_image","twitter_creator":"@https:\/\/twitter.com\/paskal_1973","twitter_site":"@javacodegeeks","twitter_misc":{"Written by":"Pascal Alma","Est. reading time":"3 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#article","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html"},"author":{"name":"Pascal Alma","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/a4c0bb5bfa87eb00be92c7a1d293fecf"},"headline":"Unit testing a Java Hadoop job","datePublished":"2013-09-11T22:00:37+00:00","mainEntityOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html"},"wordCount":282,"commentCount":2,"publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg","keywords":["Apache Hadoop","Apache MRUnit","Big Data"],"articleSection":["Enterprise 
Java"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html","url":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html","name":"Unit testing a Java Hadoop job","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#primaryimage"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg","datePublished":"2013-09-11T22:00:37+00:00","description":"In my previous post I showed how to setup a complete Maven based project to create a Hadoop job in Java. Of course it wasn\u2019t complete because it 
is","breadcrumb":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#primaryimage","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/09\/apache-mrunit-logo.jpg","width":150,"height":150},{"@type":"BreadcrumbList","@id":"https:\/\/www.javacodegeeks.com\/2013\/09\/unit-testing-a-java-hadoop-job.html#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.javacodegeeks.com\/"},{"@type":"ListItem","position":2,"name":"Java","item":"https:\/\/www.javacodegeeks.com\/category\/java"},{"@type":"ListItem","position":3,"name":"Enterprise Java","item":"https:\/\/www.javacodegeeks.com\/category\/java\/enterprise-java"},{"@type":"ListItem","position":4,"name":"Unit testing a Java Hadoop job"}]},{"@type":"WebSite","@id":"https:\/\/www.javacodegeeks.com\/#website","url":"https:\/\/www.javacodegeeks.com\/","name":"Java Code Geeks","description":"Java Developers Resource Center","publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"alternateName":"JCG","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.javacodegeeks.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.javacodegeeks.com\/#organization","name":"Exelixis Media 
P.C.","url":"https:\/\/www.javacodegeeks.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","width":864,"height":246,"caption":"Exelixis Media P.C."},"image":{"@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/javacodegeeks","https:\/\/x.com\/javacodegeeks"]},{"@type":"Person","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/a4c0bb5bfa87eb00be92c7a1d293fecf","name":"Pascal Alma","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/53ba6f041ccc86b6efd6278d4bcffecc424dc8eeaca5593acab22ae19748f5cb?s=96&d=mm&r=g","caption":"Pascal Alma"},"description":"Pascal is a senior JEE Developer and Architect at 4Synergy in The Netherlands. Pascal has been designing and building J2EE applications since 2001. He is particularly interested in Open Source toolstack (Mule, Spring Framework, JBoss) and technologies like Web Services, SOA and Cloud technologies. 
Specialties: JEE, SOA, Mule ESB, Maven, Cloud Technology, Amazon AWS.","sameAs":["http:\/\/pragmaticintegrator.wordpress.com\/","http:\/\/www.linkedin.com\/in\/pascalalma","https:\/\/x.com\/https:\/\/twitter.com\/paskal_1973"],"url":"https:\/\/www.javacodegeeks.com\/author\/pascal-alma"}]}},"_links":{"self":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/17079","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/users\/366"}],"replies":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/comments?post=17079"}],"version-history":[{"count":0,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/17079\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media\/17259"}],"wp:attachment":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media?parent=17079"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/categories?post=17079"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/tags?post=17079"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}