{"id":19356,"date":"2013-12-03T13:00:31","date_gmt":"2013-12-03T11:00:31","guid":{"rendered":"http:\/\/www.javacodegeeks.com\/?p=19356"},"modified":"2013-12-02T22:22:34","modified_gmt":"2013-12-02T20:22:34","slug":"mongodb-facts-80000-insertssecond-on-commodity-hardware","status":"publish","type":"post","link":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html","title":{"rendered":"MongoDB Facts: 80000+ inserts\/second on commodity hardware"},"content":{"rendered":"<p>While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of increasing data loads. We settled for 50 million documents, since beyond this number we would consider sharding anyway.<\/p>\n<p>Each time event looks like this:<\/p>\n<pre class=\" brush:java\">{\r\n        \"_id\" : ObjectId(\"5298a5a03b3f4220588fe57c\"),\r\n        \"created_on\" : ISODate(\"2012-04-22T01:09:53Z\"),\r\n        \"value\" : 0.1647851116706831\r\n}<\/pre>\n<p>As we wanted to get random values, we thought of generating them using JavaScript or Python (we could have tried it in Java, but we wanted to write it as fast as possible). 
We didn\u2019t know which one would be faster, so we decided to test them.<\/p>\n<p>Our first try was with a JavaScript file run through the MongoDB shell.<\/p>\n<p>Here is what it looks like:<\/p>\n<pre class=\" brush:js\">var minDate = new Date(2012, 0, 1, 0, 0, 0, 0);\r\nvar maxDate = new Date(2013, 0, 1, 0, 0, 0, 0);\r\nvar delta = maxDate.getTime() - minDate.getTime();\r\n\r\nvar job_id = arg2;\r\n\r\nvar documentNumber = arg1;\r\nvar batchNumber = 5 * 1000;\r\n\r\nvar job_name = 'Job#' + job_id;\r\nvar start = new Date();\r\n\r\nvar batchDocuments = new Array();\r\nvar index = 0;\r\n\r\nwhile(index &lt; documentNumber) {\r\n\tvar date = new Date(minDate.getTime() + Math.random() * delta);\r\n\tvar value = Math.random();\r\n\tvar document = {\r\n\t\tcreated_on : date,\r\n\t\tvalue : value\r\n\t};\r\n\tbatchDocuments[index % batchNumber] = document;\r\n\tif((index + 1) % batchNumber == 0) {\r\n\t\tdb.randomData.insert(batchDocuments);\r\n\t}\r\n\tindex++;\r\n\tif(index % 100000 == 0) {\r\n\t\tprint(job_name + ' inserted ' + index + ' documents.');\r\n\t}\r\n}\r\nprint(job_name + ' inserted ' + documentNumber + ' in ' + (new Date() - start)\/1000.0 + 's');<\/pre>\n<p>This is how we ran it and what we got:<\/p>\n<pre class=\" brush:bash\">mongo random --eval \"var arg1=50000000;arg2=1\" create_random.js\r\nJob#1 inserted 100000 documents.\r\nJob#1 inserted 200000 documents.\r\nJob#1 inserted 300000 documents.\r\n...\r\nJob#1 inserted 49900000 documents.\r\nJob#1 inserted 50000000 in 566.294s<\/pre>\n<p>Well, this already exceeded my wildest expectations (88293 inserts\/second).<\/p>\n<p>Now it\u2019s Python\u2019s turn. 
You will need to install pymongo to run it.<\/p>\n<pre class=\" brush:python\">import sys\r\nimport os\r\nimport pymongo\r\nimport time\r\nimport random\r\n\r\nfrom datetime import datetime\r\n\r\nmin_date = datetime(2012, 1, 1)\r\nmax_date = datetime(2013, 1, 1)\r\ndelta = (max_date - min_date).total_seconds()\r\n\r\njob_id = '1'\r\n\r\nif len(sys.argv) &lt; 2:\r\n\tsys.exit(\"You must supply the item_number argument\")\r\nelif len(sys.argv) &gt; 2:\r\n\tjob_id = sys.argv[2]\r\n\r\ndocuments_number = int(sys.argv[1])\r\nbatch_number = 5 * 1000\r\n\r\njob_name = 'Job#' + job_id\r\nstart = datetime.now()\r\n\r\n# obtain a mongo connection\r\nconnection = pymongo.Connection(\"mongodb:\/\/localhost\", safe=True)\r\n\r\n# obtain a handle to the random database\r\ndb = connection.random\r\ncollection = db.randomData\r\n\r\nbatch_documents = [i for i in range(batch_number)]\r\n\r\nfor index in range(documents_number):\r\n\ttry:\r\n\t\tdate = datetime.fromtimestamp(time.mktime(min_date.timetuple()) + int(round(random.random() * delta)))\r\n\t\tvalue = random.random()\r\n\t\tdocument = {\r\n\t\t\t'created_on' : date,\r\n\t\t\t'value' : value,\r\n\t\t}\r\n\t\tbatch_documents[index % batch_number] = document\r\n\t\tif (index + 1) % batch_number == 0:\r\n\t\t\tcollection.insert(batch_documents)\r\n\t\tindex += 1\r\n\t\tif index % 100000 == 0:\r\n\t\t\tprint job_name, ' inserted ', index, ' documents.'\r\n\texcept:\r\n\t\tprint 'Unexpected error:', sys.exc_info()[0], ', for index ', index\r\n\t\traise\r\nprint job_name, ' inserted ', documents_number, ' in ', (datetime.now() - start).total_seconds(), 's'<\/pre>\n<p>We ran it, and this is what we got this time:<\/p>\n<pre class=\" brush:bash\">python create_random.py 50000000\r\nJob#1  inserted  100000  documents.\r\nJob#1  inserted  200000  
documents.\r\nJob#1  inserted  300000  documents.\r\n...\r\nJob#1  inserted  49900000  documents.\r\nJob#1  inserted  50000000  in  1713.501 s<\/pre>\n<p>This is slower than the JavaScript version (29180 inserts\/second), but let\u2019s not get discouraged. Python is a full-featured programming language, so how about taking advantage of all our CPU cores (e.g. 4 cores) and starting one script per core, each one inserting a fraction of the total number of documents (e.g. 12500000)?<\/p>\n<pre class=\" brush:python\">import sys\r\nimport pymongo\r\nimport time\r\nimport subprocess\r\nimport multiprocessing\r\n\r\nfrom datetime import datetime\r\n\r\ncpu_count = multiprocessing.cpu_count()\r\n\r\n# obtain a mongo connection\r\nconnection = pymongo.Connection('mongodb:\/\/localhost', safe=True)\r\n\r\n# obtain a handle to the random database\r\ndb = connection.random\r\ncollection = db.randomData\r\n\r\ntotal_documents_count = 50 * 1000 * 1000\r\ninserted_documents_count = 0\r\nsleep_seconds = 1\r\nsleep_count = 0\r\n\r\nfor i in range(cpu_count):\r\n\tdocuments_number = str(total_documents_count\/cpu_count)\r\n\tprint documents_number\r\n\tsubprocess.Popen(['python', '..\/create_random.py', documents_number, str(i)])\r\n\r\nstart = datetime.now()\r\n\r\nwhile inserted_documents_count &lt; total_documents_count:\r\n\tinserted_documents_count = collection.count()\r\n\tif (sleep_count &gt; 0 and sleep_count % 60 == 0):\r\n\t\tprint 'Inserted ', inserted_documents_count, ' documents.'\r\n\tif (inserted_documents_count &lt; total_documents_count):\r\n\t\tsleep_count += 1\r\n\t\ttime.sleep(sleep_seconds)\r\n\r\nprint 'Inserting ', total_documents_count, ' took ', (datetime.now() - start).total_seconds(), 's'<\/pre>\n<p>Running the parallel execution Python script goes like this:<\/p>\n<pre class=\" brush:bash\">python create_random_parallel.py\r\nJob#3  inserted  100000  documents.\r\nJob#2  inserted  100000  documents.\r\nJob#0  inserted  100000  
documents.\r\nJob#1  inserted  100000  documents.\r\nJob#3  inserted  200000  documents.\r\n...\r\nJob#2  inserted  12500000  in  571.819 s\r\nJob#0  inserted  12400000  documents.\r\nJob#3  inserted  10800000  documents.\r\nJob#1  inserted  12400000  documents.\r\nJob#0  inserted  12500000  documents.\r\nJob#0  inserted  12500000  in  577.061 s\r\nJob#3  inserted  10900000  documents.\r\nJob#1  inserted  12500000  documents.\r\nJob#1  inserted  12500000  in  578.427 s\r\nJob#3  inserted  11000000  documents.\r\n...\r\nJob#3  inserted  12500000  in  623.999 s\r\nInserting  50000000  took  624.655 s<\/pre>\n<p>This is very good indeed (80044 inserts\/second), even if it is still slower than the first JavaScript import. So let\u2019s adapt this last Python script to run the JavaScript through multiple MongoDB shells.<\/p>\n<p>Since I couldn\u2019t supply the required arguments to the mongo command launched as a sub-process by the main Python script, I came up with the following alternative:<\/p>\n<pre class=\" brush:python\">for i in range(cpu_count):\r\n\tdocuments_number = str(total_documents_count\/cpu_count)\r\n\tscript_name = 'create_random_' + str(i + 1) + '.bat'\r\n\tscript_file = open(script_name, 'w')\r\n\tscript_file.write('mongo random --eval \"var arg1=' + documents_number + ';arg2=' + str(i + 1) + '\" ..\/create_random.js')\r\n\tscript_file.close()\r\n\tsubprocess.Popen(script_name)<\/pre>\n<p>We generate the shell scripts dynamically and let Python run them for us.<\/p>\n<pre class=\" brush:bash\">Job#1 inserted 100000 documents.\r\nJob#4 inserted 100000 documents.\r\nJob#3 inserted 100000 documents.\r\nJob#2 inserted 100000 documents.\r\nJob#1 inserted 200000 documents.\r\n...\r\nJob#4 inserted 12500000 in 566.438s\r\nJob#3 inserted 12300000 documents.\r\nJob#2 inserted 10800000 documents.\r\nJob#1 inserted 11600000 documents.\r\nJob#3 inserted 12400000 documents.\r\nJob#1 inserted 11700000 documents.\r\nJob#2 inserted 10900000 documents.\r\nJob#1 inserted 11800000 
documents.\r\nJob#3 inserted 12500000 documents.\r\nJob#3 inserted 12500000 in 574.782s\r\nJob#2 inserted 11000000 documents.\r\nJob#1 inserted 11900000 documents.\r\nJob#2 inserted 11100000 documents.\r\nJob#1 inserted 12000000 documents.\r\nJob#2 inserted 11200000 documents.\r\nJob#1 inserted 12100000 documents.\r\nJob#2 inserted 11300000 documents.\r\nJob#1 inserted 12200000 documents.\r\nJob#2 inserted 11400000 documents.\r\nJob#1 inserted 12300000 documents.\r\nJob#2 inserted 11500000 documents.\r\nJob#1 inserted 12400000 documents.\r\nJob#2 inserted 11600000 documents.\r\nJob#1 inserted 12500000 documents.\r\nJob#1 inserted 12500000 in 591.073s\r\nJob#2 inserted 11700000 documents.\r\n...\r\nJob#2 inserted 12500000 in 599.005s\r\nInserting  50000000  took  599.253 s<\/pre>\n<p>This is fast too (83437 inserts\/second), but it still can\u2019t beat our first attempt.<\/p>\n<h2>Conclusion<\/h2>\n<p>My PC configuration is nothing out of the ordinary; the only optimization is that MongoDB runs on an SSD drive.<\/p>\n<p><a href=\"http:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/12\/vlad_pc.png\"><img decoding=\"async\" class=\"aligncenter size-medium wp-image-19383\" alt=\"vlad_pc\" src=\"http:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/12\/vlad_pc-300x146.png\" width=\"300\" height=\"146\" srcset=\"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/12\/vlad_pc-300x146.png 300w, https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2013\/12\/vlad_pc.png 543w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/a><\/p>\n<p>The first attempt yielded the best results, and while monitoring CPU resources I realized that MongoDB leverages all of them, even for a single shell console. 
The Python script running on all cores was also fast enough, and it has the advantage that we could turn it into a fully operational application if we wanted to.<\/p>\n<ul>\n<li>Code available on <a href=\"https:\/\/github.com\/vladmihalcea\/vladmihalcea.wordpress.com\/tree\/master\/mongodb-facts\/data-generator\/timeseries\">GitHub<\/a>.<\/li>\n<\/ul>\n<div style=\"border: 1px solid #D8D8D8; background: #FAFAFA; width: 100%; padding-left: 5px;\"><b><i>Reference: <\/i><\/b><a href=\"http:\/\/vladmihalcea.wordpress.com\/2013\/12\/01\/mongodb-facts-80000-insertssecond-on-commodity-hardware\/\">MongoDB Facts: 80000+ inserts\/second on commodity hardware<\/a> from our <a href=\"http:\/\/www.javacodegeeks.com\/jcg\">JCG partner<\/a> Vlad Mihalcea at <a href=\"http:\/\/vladmihalcea.wordpress.com\/\">Vlad Mihalcea&#8217;s Blog<\/a>.<\/div>\n","protected":false},"excerpt":{"rendered":"<p>While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of increasing data loads. We settled for 50 million documents, since beyond this number we would consider sharding anyway. 
Each time event looks like this: &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &hellip;<\/p>\n","protected":false},"author":507,"featured_media":194,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8],"tags":[112,113],"class_list":["post-19356","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-enterprise-java","tag-mongodb","tag-nosql"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>MongoDB Facts: 80000+ inserts\/second on commodity hardware<\/title>\n<meta name=\"description\" content=\"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"MongoDB Facts: 80000+ inserts\/second on commodity hardware\" \/>\n<meta property=\"og:description\" content=\"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\" \/>\n<meta property=\"og:site_name\" content=\"Java Code Geeks\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/javacodegeeks\" \/>\n<meta property=\"article:author\" content=\"https:\/\/www.facebook.com\/vlad.mihalcea.71\" \/>\n<meta property=\"article:published_time\" 
content=\"2013-12-03T11:00:31+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg\" \/>\n\t<meta property=\"og:image:width\" content=\"150\" \/>\n\t<meta property=\"og:image:height\" content=\"150\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Vlad Mihalcea\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@javacodegeeks\" \/>\n<meta name=\"twitter:site\" content=\"@javacodegeeks\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Vlad Mihalcea\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"6 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\"},\"author\":{\"name\":\"Vlad Mihalcea\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/2c2d5059ee4fd88b1b3b9e52efc5b129\"},\"headline\":\"MongoDB Facts: 80000+ inserts\\\/second on commodity 
hardware\",\"datePublished\":\"2013-12-03T11:00:31+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\"},\"wordCount\":429,\"commentCount\":2,\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/nosqlunit-logo.jpg\",\"keywords\":[\"MongoDB\",\"NoSQL\"],\"articleSection\":[\"Enterprise Java\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\",\"name\":\"MongoDB Facts: 80000+ inserts\\\/second on commodity hardware\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/nosqlunit-logo.jpg\",\"datePublished\":\"2013-12-03T11:00:31+00:00\",\"description\":\"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case 
of\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/nosqlunit-logo.jpg\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2012\\\/10\\\/nosqlunit-logo.jpg\",\"width\":150,\"height\":150},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/2013\\\/12\\\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\"},{\"@type\":\"ListItem\",\"position\":3,\"name\":\"Enterprise Java\",\"item\":\"https:\\\/\\\/www.javacodegeeks.com\\\/category\\\/java\\\/enterprise-java\"},{\"@type\":\"ListItem\",\"position\":4,\"name\":\"MongoDB Facts: 80000+ inserts\\\/second on commodity hardware\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#website\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"name\":\"Java Code Geeks\",\"description\":\"Java Developers Resource 
Center\",\"publisher\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\"},\"alternateName\":\"JCG\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/www.javacodegeeks.com\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#organization\",\"name\":\"Exelixis Media P.C.\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"contentUrl\":\"https:\\\/\\\/www.javacodegeeks.com\\\/wp-content\\\/uploads\\\/2022\\\/06\\\/exelixis-logo.png\",\"width\":864,\"height\":246,\"caption\":\"Exelixis Media P.C.\"},\"image\":{\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/javacodegeeks\",\"https:\\\/\\\/x.com\\\/javacodegeeks\"]},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/www.javacodegeeks.com\\\/#\\\/schema\\\/person\\\/2c2d5059ee4fd88b1b3b9e52efc5b129\",\"name\":\"Vlad Mihalcea\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g\",\"url\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g\",\"contentUrl\":\"https:\\\/\\\/secure.gravatar.com\\\/avatar\\\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g\",\"caption\":\"Vlad Mihalcea\"},\"description\":\"Vlad Mihalcea is a software architect passionate about 
software integration, high scalability and concurrency challenges.\",\"sameAs\":[\"http:\\\/\\\/vladmihalcea.wordpress.com\\\/\",\"https:\\\/\\\/www.facebook.com\\\/vlad.mihalcea.71\",\"http:\\\/\\\/www.linkedin.com\\\/pub\\\/vlad-mihalcea\\\/20\\\/a59\\\/580\"],\"url\":\"https:\\\/\\\/www.javacodegeeks.com\\\/author\\\/vlad-mihalcea\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"MongoDB Facts: 80000+ inserts\/second on commodity hardware","description":"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html","og_locale":"en_US","og_type":"article","og_title":"MongoDB Facts: 80000+ inserts\/second on commodity hardware","og_description":"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of","og_url":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html","og_site_name":"Java Code Geeks","article_publisher":"https:\/\/www.facebook.com\/javacodegeeks","article_author":"https:\/\/www.facebook.com\/vlad.mihalcea.71","article_published_time":"2013-12-03T11:00:31+00:00","og_image":[{"width":150,"height":150,"url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg","type":"image\/jpeg"}],"author":"Vlad Mihalcea","twitter_card":"summary_large_image","twitter_creator":"@javacodegeeks","twitter_site":"@javacodegeeks","twitter_misc":{"Written by":"Vlad Mihalcea","Est. 
reading time":"6 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#article","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html"},"author":{"name":"Vlad Mihalcea","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/2c2d5059ee4fd88b1b3b9e52efc5b129"},"headline":"MongoDB Facts: 80000+ inserts\/second on commodity hardware","datePublished":"2013-12-03T11:00:31+00:00","mainEntityOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html"},"wordCount":429,"commentCount":2,"publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg","keywords":["MongoDB","NoSQL"],"articleSection":["Enterprise Java"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#respond"]}]},{"@type":"WebPage","@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html","url":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html","name":"MongoDB Facts: 80000+ inserts\/second on commodity 
hardware","isPartOf":{"@id":"https:\/\/www.javacodegeeks.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage"},"image":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage"},"thumbnailUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg","datePublished":"2013-12-03T11:00:31+00:00","description":"While experimenting with some time series collections I needed a large data set to check that our aggregation queries don\u2019t become a bottleneck in case of","breadcrumb":{"@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#primaryimage","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2012\/10\/nosqlunit-logo.jpg","width":150,"height":150},{"@type":"BreadcrumbList","@id":"https:\/\/www.javacodegeeks.com\/2013\/12\/mongodb-facts-80000-insertssecond-on-commodity-hardware.html#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.javacodegeeks.com\/"},{"@type":"ListItem","position":2,"name":"Java","item":"https:\/\/www.javacodegeeks.com\/category\/java"},{"@type":"ListItem","position":3,"name":"Enterprise Java","item":"https:\/\/www.javacodegeeks.com\/category\/java\/enterprise-java"},{"@type":"ListItem","position":4,"name":"MongoDB Facts: 80000+ inserts\/second on commodity 
hardware"}]},{"@type":"WebSite","@id":"https:\/\/www.javacodegeeks.com\/#website","url":"https:\/\/www.javacodegeeks.com\/","name":"Java Code Geeks","description":"Java Developers Resource Center","publisher":{"@id":"https:\/\/www.javacodegeeks.com\/#organization"},"alternateName":"JCG","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.javacodegeeks.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.javacodegeeks.com\/#organization","name":"Exelixis Media P.C.","url":"https:\/\/www.javacodegeeks.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","contentUrl":"https:\/\/www.javacodegeeks.com\/wp-content\/uploads\/2022\/06\/exelixis-logo.png","width":864,"height":246,"caption":"Exelixis Media P.C."},"image":{"@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/javacodegeeks","https:\/\/x.com\/javacodegeeks"]},{"@type":"Person","@id":"https:\/\/www.javacodegeeks.com\/#\/schema\/person\/2c2d5059ee4fd88b1b3b9e52efc5b129","name":"Vlad Mihalcea","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/f9f4ac0b2229b9f9fb993393b822ffbf63e60c1665a244176d3c4728565a9a9f?s=96&d=mm&r=g","caption":"Vlad Mihalcea"},"description":"Vlad Mihalcea is a software architect passionate about software integration, high scalability and concurrency 
challenges.","sameAs":["http:\/\/vladmihalcea.wordpress.com\/","https:\/\/www.facebook.com\/vlad.mihalcea.71","http:\/\/www.linkedin.com\/pub\/vlad-mihalcea\/20\/a59\/580"],"url":"https:\/\/www.javacodegeeks.com\/author\/vlad-mihalcea"}]}},"_links":{"self":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/19356","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/users\/507"}],"replies":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/comments?post=19356"}],"version-history":[{"count":0,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/posts\/19356\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media\/194"}],"wp:attachment":[{"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/media?parent=19356"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/categories?post=19356"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.javacodegeeks.com\/wp-json\/wp\/v2\/tags?post=19356"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}