{"id":19043,"date":"2020-09-18T19:59:46","date_gmt":"2020-09-18T19:59:46","guid":{"rendered":"https:\/\/ittutorial.org\/?p=19043"},"modified":"2020-09-25T14:14:54","modified_gmt":"2020-09-25T14:14:54","slug":"dimension-reduction-python-unsupervised-learning-5","status":"publish","type":"post","link":"https:\/\/ittutorial.org\/dimension-reduction-python-unsupervised-learning-5\/","title":{"rendered":"Dimension reduction | Python Unsupervised Learning -5"},"content":{"rendered":"<p>Hello, in this article, we continue the topic Unsupervised Learning.<\/p>\n<p>&nbsp;<\/p>\n<p><!--more--><\/p>\n<p>&nbsp;<\/p>\n<p>Read the previous post before this post.<\/p>\n<blockquote class=\"wp-embedded-content\" data-secret=\"b7fOTEG1pO\"><p><a href=\"https:\/\/ittutorial.org\/python-unsupervised-learning-4\/\">t-SNE visualization | Python Unsupervised Learning -4<\/a><\/p><\/blockquote>\n<p><iframe loading=\"lazy\" class=\"wp-embedded-content\" sandbox=\"allow-scripts\" security=\"restricted\" style=\"position: absolute; clip: rect(1px, 1px, 1px, 1px);\" title=\"&#8220;t-SNE visualization | Python Unsupervised Learning -4&#8221; &#8212; IT Tutorial\" src=\"https:\/\/ittutorial.org\/python-unsupervised-learning-4\/embed\/#?secret=MAPrG7XcRL#?secret=b7fOTEG1pO\" data-secret=\"b7fOTEG1pO\" width=\"600\" height=\"338\" frameborder=\"0\" marginwidth=\"0\" marginheight=\"0\" scrolling=\"no\"><\/iframe><\/p>\n<p>&nbsp;<\/p>\n<h1 class=\"dc-chapter__title\"><span style=\"color: #000000;\">Dimension reduction<\/span><\/h1>\n<p>Dimension reduction finds patterns in data, and uses these patterns to\u00a0 re-express it in a compressed form.\u00a0 This makes subsequent computation with the data much more efficient and this can be a big deal in a world of big dataset.<\/p>\n<p>&nbsp;<\/p>\n<h3>Principal Component Analysis (PCA)<\/h3>\n<p>PCA performs dimension reduction in two steps, and the first one, called &#8220;de-correlation&#8221; , doesn&#8217;t change the dimension of the data at 
all.<\/p>\n<h2>Example<\/h2>\n<p><strong><a href=\"https:\/\/drive.google.com\/file\/d\/1v6c5KhEXPvGinR0alinqYaqo-_of9agw\/view?usp=sharing\">https:\/\/drive.google.com\/file\/d\/1v6c5KhEXPvGinR0alinqYaqo-_of9agw\/view?usp=sharing<\/a><\/strong><\/p>\n<p>You can access the entire code at the link above. In the code below, <code>grains<\/code> is a 2D NumPy array of grain measurements whose 0th column is width and 1st column is length.<\/p>\n<pre>import matplotlib.pyplot as plt\r\nfrom scipy.stats import pearsonr\r\n\r\n# Assign the 0th column of grains: width\r\nwidth = grains[:,0]\r\n\r\n# Assign the 1st column of grains: length\r\nlength = grains[:,1]\r\n\r\n# Scatter plot width vs length\r\nplt.scatter(width, length)\r\nplt.axis('equal')\r\nplt.show()\r\n\r\n# Calculate the Pearson correlation\r\ncorrelation, pvalue = pearsonr(width, length)\r\n\r\n# Display the correlation\r\nprint(correlation)<\/pre>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19044\" src=\"https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_19-1.png\" alt=\"Scatter plot of grain width vs length and their Pearson correlation\" width=\"992\" height=\"489\" srcset=\"https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_19-1.png 793w, https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_19-1-300x148.png 300w, https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_19-1-768x379.png 768w\" sizes=\"auto, (max-width: 992px) 100vw, 992px\" \/><\/p>\n<pre>from sklearn.decomposition import PCA\r\n\r\n# Create PCA instance: model\r\nmodel = PCA()\r\n\r\n# Apply the fit_transform method of model to grains: pca_features\r\npca_features = model.fit_transform(grains)\r\n\r\n# Assign 0th column of pca_features: xs\r\nxs = pca_features[:,0]\r\n\r\n# Assign 1st column of pca_features: ys\r\nys = pca_features[:,1]\r\n\r\n# Scatter plot xs vs ys\r\nplt.scatter(xs, ys)\r\nplt.axis('equal')\r\nplt.show()\r\n\r\n# Calculate the Pearson correlation of xs and ys\r\ncorrelation, pvalue = pearsonr(xs, ys)\r\n\r\n# Display the correlation\r\nprint(correlation)<\/pre>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-19045\" 
src=\"https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_20-1.png\" alt=\"\" width=\"985\" height=\"553\" srcset=\"https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_20-1.png 639w, https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/Screenshot_20-1-300x169.png 300w\" sizes=\"auto, (max-width: 985px) 100vw, 985px\" \/><\/p>\n<p>&nbsp;<\/p>\n<p>See you in the next article<\/p>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Hello, in this article, we continue the topic Unsupervised Learning. &nbsp;<\/p>\n","protected":false},"author":67,"featured_media":18628,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"om_disable_all_campaigns":false,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[12904],"tags":[12362,12909,12905,12865,12898,12883,12864,12886,12889,13264,13613,13614,13615,13618,13612,13611,13616,13617,13263,12900,12899,12910,12876,12877,12878,12879,12888,12913,12907,12908,12906,13259,13262,13261,13265,12896,12872,12871,12891,12911,12868,12884,12897,12875,12869,12881,12894,12870,12880,12893,12895,12902,12890,12912,13260,12914,12901,12885,12882,12892,12915,12916,12867,13004,13005,12866,12874,12873,12887],"class_list":["post-19043","post","type-post","status-publish","format-standard","has-post-thumbnail","","category-data-science","tag-advance-python","tag-clustering-quality","tag-cross-tabulation","tag-data-science","tag-data-science-example-of-unsupervised-learning","tag-data-science-in-python","tag-datascience","tag-denetimsiz-ogrenme","tag-derin-ogrenme","tag-dimension-reduction","tag-dimensionality-reduction-example","tag-dimensionality-reduction-in-r","tag-dimensionality-reduction-pca","tag-dimen
sionality-reduction-ppt","tag-dimensionality-reduction-python","tag-dimensionality-reduction-sklearn","tag-dimensionality-reduction-tutorial","tag-dimensionality-reduction-visualization","tag-dimensionel-reduction","tag-example-of-supervised-learning","tag-example-of-unsupervised-learning","tag-inertia-measures","tag-iris-dataset-examle","tag-k-means-example","tag-kmeans-example","tag-kmeans-example-in-python","tag-makina-ogrenmesi","tag-matplotlib","tag-numpy","tag-numpy-array-in-python","tag-pandas","tag-pca","tag-principal-component","tag-principal-component-analys","tag-principal-component-analysis-pca","tag-python-advance-clustering","tag-python-classification","tag-python-clustering","tag-python-clustering-ornekleri","tag-python-cross-validation","tag-python-data-science","tag-python-deep-learning","tag-python-example-of-unsupervised-learning","tag-python-iris-dataset","tag-python-k-means","tag-python-k-means-examle","tag-python-k-means-ornek","tag-python-kmeans","tag-python-kmeans-example","tag-python-knn-ornekleri","tag-python-kumeleme-ornegi","tag-python-machine-learning-example","tag-python-makina-ogrenmesi","tag-python-matplotlib","tag-python-pca","tag-python-sklearn","tag-python-supervised-learning-example","tag-python-unlabeled-data","tag-python-unsupervised-learning-example","tag-python-unsupervised-learning-uygulamalari","tag-sklearn-clustering","tag-sklearn-cluster-kmeans","tag-supervised-learning","tag-t-sne","tag-t-sne-visualization","tag-unsupervised-learning","tag-unsupervised-learning-classification","tag-unsupervised-learning-clustering","tag-veri-bilimi"],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/ittutorial.org\/wp-content\/uploads\/2020\/09\/indir.png","jetpack_sharing_enabled":true,"amp_enabled":true,"_links":{"self":[{"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/posts\/19043","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ittuto
rial.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/users\/67"}],"replies":[{"embeddable":true,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/comments?post=19043"}],"version-history":[{"count":5,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/posts\/19043\/revisions"}],"predecessor-version":[{"id":19411,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/posts\/19043\/revisions\/19411"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/media\/18628"}],"wp:attachment":[{"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/media?parent=19043"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/categories?post=19043"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ittutorial.org\/wp-json\/wp\/v2\/tags?post=19043"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}