Lab 6: Boosting (AdaBoost) with Decision Trees
Objective:
Implement AdaBoost for classification to improve performance by combining weak classifiers.
Code:
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.3, random_state=42)

# Depth-1 trees ("decision stumps") serve as the weak learners being boosted.
# Note: in scikit-learn versions before 1.2 this parameter is named base_estimator.
ada = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1), n_estimators=50, random_state=42)
ada.fit(X_train, y_train)
y_pred = ada.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
Exercises:
1. Explain how AdaBoost focuses on misclassified samples.
2. Modify the number of estimators and observe how the accuracy changes (a starting sketch follows).
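A minimal sketch for Exercise 2, assuming the train/test split from the lab code above is still in scope; the grid of ensemble sizes is an arbitrary choice:

# Sketch for Exercise 2: vary the number of weak learners and watch test accuracy.
# Assumes X_train, X_test, y_train, y_test from the lab code above.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

for n in [10, 25, 50, 100, 200]:  # arbitrary grid of ensemble sizes
    ada = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                             n_estimators=n, random_state=42)
    ada.fit(X_train, y_train)
    acc = accuracy_score(y_test, ada.predict(X_test))
    print(f"n_estimators={n:4d}  accuracy={acc:.3f}")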
Lab 7: Naive Bayes Classifier
Objective:
Implement a Naive Bayes classifier (the Gaussian variant, suited to the continuous features of the Iris dataset).
Code:
from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.3, random_state=0)

gnb = GaussianNB()  # models each feature as Gaussian within each class
gnb.fit(X_train, y_train)
y_pred = gnb.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
Exercises:
1. What is the naive assumption in Naive Bayes?
2. Apply MultinomialNB to a text classification task (a starting sketch follows).
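A minimal starting point for Exercise 2; the four documents and their spam/ham labels below are made up purely for illustration:

# Sketch for Exercise 2: MultinomialNB on word-count features.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Toy corpus (invented for illustration); 1 = spam, 0 = not spam
docs = ["free prize money now", "meeting at noon tomorrow",
        "win money free offer", "project meeting schedule"]
labels = [1, 0, 1, 0]

X = CountVectorizer().fit_transform(docs)  # bag-of-words count matrix
clf = MultinomialNB().fit(X, labels)
print(clf.predict(X))  # on real data, evaluate on a held-out test set instead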
Lab 8: Neural Networks (Multilayer Perceptron)
Objective:
Implement a simple feedforward neural network using sklearn's MLPClassifier.
Code:
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.3, random_state=1)

# One hidden layer of 10 units; max_iter raised so training converges
mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=1)
mlp.fit(X_train, y_train)
y_pred = mlp.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
Exercises:
1. Experiment with different hidden layer sizes.
2. What activation functions does MLPClassifier support? (A sketch covering both exercises follows.)
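A combined sketch for both exercises, assuming the split from the lab code above is still in scope; the architectures are arbitrary choices, and the four activation names are the values MLPClassifier accepts:

# Sketch for Exercises 1 and 2: compare hidden layer sizes and activations.
# Assumes X_train, X_test, y_train, y_test from the lab code above.
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

for size in [(5,), (10,), (10, 10), (50,)]:               # arbitrary architectures
    for act in ['identity', 'logistic', 'tanh', 'relu']:  # activations MLPClassifier supports
        mlp = MLPClassifier(hidden_layer_sizes=size, activation=act,
                            max_iter=1000, random_state=1)
        mlp.fit(X_train, y_train)
        acc = accuracy_score(y_test, mlp.predict(X_test))
        print(f"layers={size}  activation={act:8s}  accuracy={acc:.3f}")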
Lab 9: Bayesian Networks (Basic Example)
Objective:
Understand Bayesian Networks using the Python library pgmpy.
Code:
# This example requires the pgmpy package (pip install pgmpy)
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD

# Classic burglary-alarm network: Burglary -> Alarm <- Earthquake
model = BayesianNetwork([('Burglary', 'Alarm'), ('Earthquake', 'Alarm')])

# Priors: row 0 = event absent, row 1 = event present
cpd_burglary = TabularCPD(variable='Burglary', variable_card=2, values=[[0.999], [0.001]])
cpd_earthquake = TabularCPD(variable='Earthquake', variable_card=2, values=[[0.998], [0.002]])

# P(Alarm | Burglary, Earthquake); columns are ordered
# (B=0,E=0), (B=0,E=1), (B=1,E=0), (B=1,E=1)
cpd_alarm = TabularCPD(variable='Alarm', variable_card=2,
                       values=[[0.999, 0.71, 0.06, 0.05],
                               [0.001, 0.29, 0.94, 0.95]],
                       evidence=['Burglary', 'Earthquake'],
                       evidence_card=[2, 2])

model.add_cpds(cpd_burglary, cpd_earthquake, cpd_alarm)
model.check_model()  # raises an error if any CPD is inconsistent with the graph
print("Bayesian Network created successfully.")
Exercises:
1. Explain the structure of this network.
2. Add more nodes to model more complex dependencies.
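The lab code only constructs the network; to see it answer questions, pgmpy's VariableElimination can compute posteriors such as P(Burglary | Alarm = 1). A minimal sketch, reusing the model object from the code above:

# Sketch: query the network built above, e.g. P(Burglary | the alarm rang).
from pgmpy.inference import VariableElimination

infer = VariableElimination(model)  # exact inference over the model above
posterior = infer.query(variables=['Burglary'], evidence={'Alarm': 1})
print(posterior)  # posterior over Burglary given that the alarm went off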
Lab 10: Support Vector Machine (SVM) Classifier
Objective:
Train an SVM classifier on the Iris dataset.
Code:
from sklearn.datasets import load_iris
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

iris = load_iris()
X_train, X_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.3, random_state=42)

svm = SVC(kernel='linear', random_state=42)  # linear kernel: maximum-margin hyperplane
svm.fit(X_train, y_train)
y_pred = svm.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
Exercises:
1. Try different kernels (rbf, poly) and compare the results (a starting sketch follows).
2. Explain the concept of margin in SVM.
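A minimal sketch for Exercise 1, assuming the train/test split from the lab code above is still in scope:

# Sketch for Exercise 1: compare SVM kernels on the same split.
# Assumes X_train, X_test, y_train, y_test from the lab code above.
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

for kernel in ['linear', 'rbf', 'poly']:
    svm = SVC(kernel=kernel, random_state=42)
    svm.fit(X_train, y_train)
    acc = accuracy_score(y_test, svm.predict(X_test))
    print(f"kernel={kernel:6s}  accuracy={acc:.3f}")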