Practical04.ipynb - Colab

The document contains a Python script that uses the Naive Bayes algorithm to classify wine quality based on various chemical properties. It loads a dataset, splits it into training and testing sets, trains the model, and evaluates its accuracy, which is found to be 54%. Additionally, a classification report is generated, showing precision, recall, and F1-scores for different quality ratings.


8/17/24, 9:04 PM Untitled5.ipynb - Colab

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, classification_report

data = pd.read_csv('winequality_red.csv')

print(data.head())

   fixed acidity  volatile acidity  citric acid  residual sugar  chlorides  \
0            7.4              0.70         0.00             1.9      0.076
1            7.8              0.88         0.00             2.6      0.098
2            7.8              0.76         0.04             2.3      0.092
3           11.2              0.28         0.56             1.9      0.075
4            7.4              0.70         0.00             1.9      0.076

   free sulfur dioxide  total sulfur dioxide  density    pH  sulphates  \
0                 11.0                  34.0   0.9978  3.51       0.56
1                 25.0                  67.0   0.9968  3.20       0.68
2                 15.0                  54.0   0.9970  3.26       0.65
3                 17.0                  60.0   0.9980  3.16       0.58
4                 11.0                  34.0   0.9978  3.51       0.56

   alcohol  quality
0      9.4        5
1      9.8        5
2      9.8        5
3      9.8        6
4      9.4        5
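Before modeling, it is worth checking how the quality labels are distributed, since Gaussian Naive Bayes will score poorly on rare classes (as the classification report later confirms for qualities 3 and 8). A minimal sketch of the check; the labels here are synthetic stand-ins, while the notebook itself would call `data['quality'].value_counts()` on the loaded CSV:

```python
import pandas as pd

# Synthetic stand-in for the notebook's data['quality'] column.
quality = pd.Series([5, 5, 5, 6, 5, 6, 7, 5, 6, 4])

# value_counts() shows how skewed the label distribution is.
counts = quality.value_counts()
print(counts)
```

A heavily skewed distribution here warns that plain accuracy will mostly reflect performance on the dominant classes.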

X = data.iloc[:, :-1]  # features
y = data.iloc[:, -1]   # target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
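The split above shuffles rows at random; with imbalanced quality labels, passing `stratify=y` keeps the class proportions similar in the train and test halves. A small sketch with illustrative synthetic data (the values and the 14/6 label split are assumptions for demonstration):

```python
import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(40).reshape(20, 2)   # 20 samples, 2 dummy features
y = np.array([5] * 14 + [6] * 6)   # imbalanced labels: 70% fives, 30% sixes

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.5, random_state=42, stratify=y
)
# Each half keeps the 70/30 ratio: 7 fives and 3 sixes per split.
print(sorted(y_te))
```

Stratification matters most for the rarest classes, which a purely random split can leave out of the test set entirely.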

model = GaussianNB()

model.fit(X_train, y_train)

GaussianNB()

y_pred = model.predict(X_test)

accuracy = accuracy_score(y_test, y_pred)


print(f"Accuracy: {accuracy:.2f}")

Accuracy: 0.54
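An accuracy of 0.54 is easier to judge against a majority-class baseline. The largest class in the test set (quality 6) accounts for 200 of 480 samples, so always predicting 6 would score about 0.42; Naive Bayes beats that, but not by a wide margin. scikit-learn's `DummyClassifier` makes the comparison explicit. A sketch using labels with the same class sizes as the notebook's test set (the feature matrix is a placeholder, since the dummy ignores features):

```python
import numpy as np
from sklearn.dummy import DummyClassifier

# Labels with the same class counts as the notebook's test set.
y_test = np.array([6] * 200 + [5] * 195 + [7] * 61 + [4] * 17 + [8] * 6 + [3] * 1)
X_test = np.zeros((len(y_test), 1))  # features are ignored by the dummy

baseline = DummyClassifier(strategy="most_frequent").fit(X_test, y_test)
print(round(baseline.score(X_test, y_test), 2))  # majority-class accuracy
```

In the real notebook the same comparison would use the actual `X_test` and `y_test` from the split above.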

report = classification_report(y_test, y_pred)


print(report)

              precision    recall  f1-score   support

           3       0.00      0.00      0.00         1
           4       0.12      0.12      0.12        17
           5       0.68      0.62      0.65       195
           6       0.52      0.54      0.53       200
           7       0.40      0.49      0.44        61
           8       0.00      0.00      0.00         6

    accuracy                           0.54       480
   macro avg       0.29      0.29      0.29       480
weighted avg       0.55      0.54      0.54       480
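The macro average is the unweighted mean of the six per-class scores, while the weighted average weights each class by its support. That is why the macro F1 (0.29) is dragged down by the zero-scoring rare classes 3 and 8, while the weighted F1 stays near the overall accuracy. A quick arithmetic check against the rounded values in the table (so the weighted figure only approximates the report's, which uses unrounded scores):

```python
# Per-class F1 scores and supports from the report, in class order 3..8.
f1 = [0.00, 0.12, 0.65, 0.53, 0.44, 0.00]
support = [1, 17, 195, 200, 61, 6]

macro_f1 = sum(f1) / len(f1)                                   # unweighted mean
weighted_f1 = sum(s * v for s, v in zip(support, f1)) / sum(support)

print(round(macro_f1, 2))  # matches the report's macro avg of 0.29
print(weighted_f1)
```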

