
Lab Exam Cheat Sheet: Machine Learning for Real World Applications

Computer Science and Engineering Department

June 22, 2025

Instructions
This cheat sheet covers all 16 experiments from the lab manual for your lab exam. Each
experiment includes the aim, Python code, and output (numerical or graphical). Use this
as a reference to write the aim, code, and output for each experiment question in your
exam paper. Compile this LaTeX document using a tool like Overleaf to generate a PDF.

1 Experiment 1: Basic Operations on Python Vectors

1.1 Aim
To perform basic operations (addition, subtraction, multiplication, division) on Python
vectors using NumPy.

1.2 Code
import numpy as np

vec1 = np.array([10, 20, 30, 40, 50])
vec2 = np.array([50, 60, 70, 80, 90])

addition = vec1 + vec2
subtraction = vec1 - vec2
multiplication = vec1 * vec2
division = vec1 / vec2

print("Vector Addition:", addition)
print("Vector Subtraction:", subtraction)
print("Vector Multiplication:", multiplication)
print("Vector Division:", division)


1.3 Output
Vector Addition: [ 60 80 100 120 140]
Vector Subtraction: [-40 -40 -40 -40 -40]
Vector Multiplication: [ 500 1200 2100 3200 4500]
Vector Division: [0.2 0.33333333 0.42857143 0.5 0.55555556]
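
Note that all four operations are element-wise. If the exam instead asks for the dot product, a minimal supplement (not part of the lab manual's listing) is:

# Sum of element-wise products: 500 + 1200 + 2100 + 3200 + 4500 = 11500
print("Dot Product:", np.dot(vec1, vec2))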

2 Experiment 2: Matrix Operations with NumPy


2.1 Aim
To print a matrix and multiply two matrices using NumPy.

2.2 Code
import numpy as np

mat = np.array([[1, 2], [3, 4]])
print("The matrix is:", mat)

matrix1 = np.array([[1, 2], [3, 4]])
matrix2 = np.array([[5, 6], [7, 8]])
matrix_multiplication = np.dot(matrix1, matrix2)

print("Matrix Multiplication:", matrix_multiplication)

2.3 Output
The matrix is: [[1 2]
[3 4]]
Matrix Multiplication: [[19 22]
[43 50]]
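
Each entry of the product can be checked by hand as a row of matrix1 times a column of matrix2:

1*5 + 2*7 = 19    1*6 + 2*8 = 22
3*5 + 4*7 = 43    3*6 + 4*8 = 50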

3 Experiment 3: Calculate Mean, Median, and Mode


3.1 Aim
To calculate mean, median, and mode using Python libraries.

3.2 Code
import numpy as np
import statistics

n = int(input("Enter the size: "))
x = np.empty(n)
for i in range(n):
    x[i] = int(input(f"Enter element {i+1}: "))

mean = np.mean(x)
median = np.median(x)
mode = statistics.mode(x)

print("Mean:", mean)
print("Median:", median)
print("Mode:", mode)

3.3 Output
Enter the size: 6
Enter element 1: 40
Enter element 2: 20
Enter element 3: 20
Enter element 4: 40
Enter element 5: 50
Enter element 6: 60
Mean: 38.333333333333336
Median: 40.0
Mode: 20
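
A caution on this sample: 20 and 40 each occur twice, so the data is bimodal. statistics.mode breaks ties by returning the first mode it encounters (Python 3.8+; older versions raise StatisticsError for multimodal data), so the printed value can differ across versions. statistics.multimode (Python 3.8+) makes the tie explicit:

import statistics

data = [40, 20, 20, 40, 50, 60]
# Returns every value with the highest count, in order of first appearance
print(statistics.multimode(data))   # [40, 20]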

4 Experiment 4: Matrix Addition


4.1 Aim
To perform matrix addition using NumPy.

4.2 Code
import numpy as np

matrix1 = np.array([[1, 2], [3, 4]])
matrix2 = np.array([[5, 6], [7, 8]])
matrix_addition = matrix1 + matrix2

print("Matrix Addition:", matrix_addition)

4.3 Output
Matrix Addition: [[ 6 8]
[10 12]]


5 Experiment 5: Matrix Subtraction


5.1 Aim
To perform matrix subtraction using NumPy.

5.2 Code
import numpy as np

matrix1 = np.array([[1, 2], [3, 4]])
matrix2 = np.array([[5, 6], [7, 8]])
matrix_subtraction = matrix1 - matrix2

print("Matrix Subtraction:", matrix_subtraction)

5.3 Output
Matrix Subtraction: [[-4 -4]
[-4 -4]]

6 Experiment 6: Matrix Multiplication


6.1 Aim
To perform matrix multiplication using NumPy.

6.2 Code
import numpy as np

matrix1 = np.array([[1, 2], [3, 4]])
matrix2 = np.array([[5, 6], [7, 8]])
matrix_multiplication = np.dot(matrix1, matrix2)

print("Matrix Multiplication:", matrix_multiplication)

6.3 Output
Matrix Multiplication: [[19 22]
[43 50]]

7 Experiment 7: Matrix Determinant


7.1 Aim
To calculate the determinant of a matrix using NumPy.


7.2 Code
import numpy as np

matrix = np.array([[1, 2], [3, 4]])
determinant = np.linalg.det(matrix)

print("Matrix Determinant:", determinant)

7.3 Output
Matrix Determinant: -2.0000000000000004
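
For a 2x2 matrix the determinant is ad - bc, so here it is 1*4 - 2*3 = -2; the trailing digits are floating-point round-off from the factorization NumPy uses internally. If a clean integer is expected in the answer, rounding is safe for small matrices:

print("Matrix Determinant:", round(determinant))   # -2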

8 Experiment 8: Matrix Inverse


8.1 Aim
To calculate the inverse of a matrix using NumPy.

8.2 Code
import numpy as np

matrix = np.array([[1, 2], [3, 4]])
inverse = np.linalg.inv(matrix)

print("Matrix Inverse:", inverse)

8.3 Output
Matrix Inverse: [[-2. 1. ]
[ 1.5 -0.5]]
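
A quick sanity check (a supplement, not in the manual's listing) is to confirm that the matrix times its inverse gives the identity, up to floating-point tolerance:

# matrix @ inverse should be (approximately) the 2x2 identity
print(np.allclose(matrix @ inverse, np.eye(2)))   # True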

9 Experiment 9: KNN Classification


9.1 Aim
To implement K-Nearest Neighbors (KNN) classification using the Iris dataset.

9.2 Code
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import confusion_matrix, classification_report

iris = load_iris()
X = iris.data
y = iris.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

knn = KNeighborsClassifier()
knn.fit(X_train, y_train)
y_pred = knn.predict(X_test)

plt.figure()
sns.heatmap(confusion_matrix(y_test, y_pred), annot=True)
plt.xlabel("Predicted")
plt.ylabel("Truth")
plt.title("Confusion Matrix")
plt.show()
print("Classification Report:\n", classification_report(y_test, y_pred))

9.3 Output
Classification Report:
precision recall f1-score support
0 1.00 1.00 1.00 10
1 0.92 0.92 0.92 12
2 0.88 0.88 0.88 8
accuracy 0.93 30
macro avg 0.93 0.93 0.93 30
weighted avg 0.93 0.93 0.93 30

(A heatmap showing the confusion matrix with true vs. predicted labels for Iris classes.)
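Because train_test_split is called without random_state, the split (and hence the exact report) changes on every run. If reproducible numbers are wanted in an exam answer, a hedged variant continuing from the listing above is to fix the seed and report a single accuracy figure:

from sklearn.metrics import accuracy_score

# Fixing random_state makes the split, and the score, repeatable
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
knn.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, knn.predict(X_test)))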

10 Experiment 10: Decision Tree Classification


10.1 Aim
To implement Decision Tree classification using the Iris dataset.

10.2 Code
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score
from sklearn import tree
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
df = pd.DataFrame(data=iris.data, columns=iris.feature_names)
df['target'] = iris.target

X = df[iris.feature_names]
y = df['target']

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

dt = DecisionTreeClassifier()
dt.fit(X_train, y_train)
y_pred = dt.predict(X_test)

plt.figure(figsize=(20, 10))
tree.plot_tree(dt, feature_names=iris.feature_names,
               class_names=iris.target_names, filled=True)
plt.show()
print("Accuracy Score:", accuracy_score(y_test, y_pred))

10.3 Output
Accuracy Score: 0.9333333333333333

(A visualization of the decision tree structure with nodes representing feature splits and
leaves indicating class labels.)
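
If no plotting backend is available, the same fitted tree can be shown as text with sklearn's export_text (an optional supplement to the listing above):

from sklearn.tree import export_text

# Prints one line per node, e.g. "|--- petal width (cm) <= 0.80"
print(export_text(dt, feature_names=iris.feature_names))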

11 Experiment 11: Linear Regression


11.1 Aim
To implement Linear Regression using synthetic data.

11.2 Code
import numpy as np
import matplotlib.pyplot as plt
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

X = 2 * np.random.rand(100, 1)
y = 4 + 3 * X + np.random.randn(100, 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

model = LinearRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

plt.scatter(X_test, y_test, color='b', label='Actual Data')
plt.plot(X_test, y_pred, color='r', label='Regression Line')
plt.xlabel('X')
plt.ylabel('y')
plt.legend()
plt.show()

11.3 Output
(A scatter plot of actual data points in blue and a red regression line fitting the data.)
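
Since the data is generated as y = 4 + 3x plus noise, the fitted parameters should land near those values; printing them is a useful numeric check (a supplement to the listing above):

# Expect a slope near 3 and an intercept near 4; exact values vary with the noise
print("Slope:", model.coef_)
print("Intercept:", model.intercept_)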

12 Experiment 12: Polynomial Regression


12.1 Aim
To implement Polynomial Regression using synthetic data.

12.2 Code
import numpy as np
import matplotlib.pyplot as plt
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = np.array([5, 15, 25, 35, 45, 55]).reshape(-1, 1)
y = np.array([15, 11, 2, 8, 25, 32])

poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)

model = LinearRegression()
model.fit(X_poly, y)
y_pred = model.predict(X_poly)

plt.scatter(X, y, color='blue')
plt.plot(X, y_pred, color='red')
plt.xlabel('X')
plt.ylabel('y')
plt.show()

12.3 Output
(A scatter plot of data points in blue with a red polynomial curve fitting the data.)
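
A numeric check on how well the degree-2 curve fits these six points is the R^2 score (a supplement, not in the manual's listing):

from sklearn.metrics import r2_score

# 1.0 is a perfect fit; the value changes if a different degree is chosen
print("R^2:", r2_score(y, y_pred))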


13 Experiment 13: Multiple Linear Regression


13.1 Aim
To implement Multiple Linear Regression using synthetic data.

13.2 Code
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[0, 0], [1, 1], [2, 2], [3, 3], [4, 4], [5, 5]])
y = np.array([0, 2, 4, 6, 8, 10])

model = LinearRegression()
model.fit(X, y)
y_pred = model.predict(X)

print("Coefficients:", model.coef_)
print("Intercept:", model.intercept_)

13.3 Output
Coefficients: [1. 1.]
Intercept: 0.0
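
Note that the two features here are identical (x2 = x1) and y = 2*x1, so the least-squares solver splits the weight evenly and returns [1, 1] with intercept 0. A quick sanity prediction on a new point:

# y = 1*x1 + 1*x2 + 0, so [6, 6] should map to about 12
print(model.predict(np.array([[6, 6]])))   # [12.]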

14 Experiment 14: K-Means Clustering


14.1 Aim
To implement K-Means clustering using synthetic data.

14.2 Code
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

X, y = make_blobs(n_samples=500, n_features=2, centers=2, random_state=42)

plt.figure(0)
plt.grid(True)
plt.scatter(X[:, 0], X[:, 1])
plt.show()

kmeans = KMeans(n_clusters=2, random_state=42)
kmeans.fit(X)

plt.scatter(X[:, 0], X[:, 1], c=kmeans.labels_, cmap='viridis')
plt.scatter(kmeans.cluster_centers_[:, 0], kmeans.cluster_centers_[:, 1],
            s=300, c='red', marker='*')
plt.grid(True)
plt.show()

14.3 Output
(Two scatter plots: one showing unclustered data points, another showing data points
colored by cluster with red star markers for centroids.)
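
The fitted model can also assign new points to clusters and report its inertia (the sum of squared distances from each sample to its nearest centroid); a short supplement continuing from the listing above:

# Cluster labels for two hypothetical new points (coordinates chosen arbitrarily)
print(kmeans.predict(np.array([[0.0, 0.0], [-5.0, 10.0]])))
print("Inertia:", kmeans.inertia_)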

15 Experiment 15: Support Vector Machine (SVM) Classification

15.1 Aim
To implement Support Vector Machine (SVM) classification using the Iris dataset.

15.2 Code
import numpy as np
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, classification_report

iris = datasets.load_iris()
X = iris.data[:, :2]
y = iris.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

svm = SVC(kernel='linear')
svm.fit(X_train, y_train)
y_pred = svm.predict(X_test)

plt.scatter(X[:, 0], X[:, 1], c=y, cmap='viridis')
plt.xlabel('Sepal Length')
plt.ylabel('Sepal Width')
plt.title('SVM Classification')
plt.show()

print("Accuracy:", accuracy_score(y_test, y_pred))
print("Classification Report:\n", classification_report(y_test, y_pred))


15.3 Output
Accuracy: 0.9
Classification Report:
precision recall f1-score support
0 1.00 1.00 1.00 10
1 0.83 0.83 0.83 12
2 0.88 0.88 0.88 8
accuracy 0.90 30
macro avg 0.90 0.90 0.90 30
weighted avg 0.90 0.90 0.90 30

(A scatter plot of Iris data points colored by class based on sepal length and width.)
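
Accuracy here is limited because only the first two features (sepal length and width) are used, which is what makes the 2-D scatter plot possible. A hedged variant continuing from the listing above trains on all four features and typically scores higher, at the cost of the plot:

X_full = iris.data   # all four measurements
X_train, X_test, y_train, y_test = train_test_split(X_full, y, test_size=0.2, random_state=42)
svm_full = SVC(kernel='linear').fit(X_train, y_train)
print("Accuracy (4 features):", accuracy_score(y_test, svm_full.predict(X_test)))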

16 Experiment 16: Principal Component Analysis (PCA)

16.1 Aim
To perform dimensionality reduction using Principal Component Analysis on the Iris
dataset.

16.2 Code
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

iris = load_iris()

X = iris.data
y = iris.target

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

pca = PCA(n_components=2)
X_pca = pca.fit_transform(X_scaled)

df = pd.DataFrame(data=X_pca, columns=['PC1', 'PC2'])

plt.figure(figsize=(8, 6))
colors = ['red', 'green', 'blue']
for target, color in zip(np.unique(y), colors):
    plt.scatter(df.loc[y == target, 'PC1'], df.loc[y == target, 'PC2'], c=color)
plt.xlabel('Principal Component 1')
plt.ylabel('Principal Component 2')
plt.title('PCA of Iris Dataset')
plt.legend(iris.target_names)
plt.grid()
plt.show()

16.3 Output
(A scatter plot showing the Iris dataset projected onto the first two principal components,
with points colored by class.)
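
It is also worth reporting how much variance the two components retain; pca.explained_variance_ratio_ gives the fraction per component (for the standardized Iris data the first two components keep roughly 96% of the total variance):

# Fraction of total variance captured by PC1 and PC2
print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Total:", pca.explained_variance_ratio_.sum())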
