
ARTIFICIAL INTELLIGENCE WITH PYTHON

Copyright © 2025 Skill Foundry


INDEX

INTRODUCTION
Setting Up AI Development Environment with Python
Understanding Machine Learning — The Heart of AI
Supervised Learning Deep Dive — Regression and Classification Models
Unsupervised Learning Deep Dive — Discovering Hidden Patterns
Neural Networks Fundamentals — Building Brains for AI
Project — Build a Neural Network to Classify Handwritten Digits
Deep Learning for Image Classification — CNNs Explained
Advanced Image Classification — Transfer Learning
Natural Language Processing (NLP) Basics with Python
Spam Detection Using Machine Learning
Deep Learning for Text Classification (with NLP)
Computer Vision Basics and Image Classification
AI for Automation: Files, Web, and Emails
AI Chatbots and Virtual Assistants



INTRODUCTION

What is Artificial Intelligence?

Artificial Intelligence (AI) is a field of computer science that aims to create machines
capable of performing tasks that typically require human intelligence. These tasks
include learning, reasoning, problem-solving, perception, language understanding,
and even creativity.

In simple words:
AI is about making computers think, learn, and act intelligently, just like humans.

Core Objectives of AI

- Automation: Automating tasks that are repetitive or complex.
- Learning: Enabling machines to learn from experience (data).
- Reasoning: Making logical deductions and solving problems.
- Perception: Interpreting the world through images, sounds, and sensors.
- Interaction: Communicating and collaborating with humans naturally.

A Short History of AI

Year   Milestone
1950   Alan Turing introduces the "Turing Test" to assess a machine's ability to exhibit intelligent behavior.
1956   The term "Artificial Intelligence" is coined by John McCarthy at the Dartmouth Conference.
1980s  Rise of Expert Systems (programs that mimic human experts).
2012   Deep learning gains popularity after a major breakthrough in image recognition.
2020+  AI becomes mainstream — self-driving cars, AI doctors, AI-generated art, and intelligent personal assistants.



Real-World Applications of AI

AI is not science fiction anymore; it is part of our daily lives.

Industry AI Applications

Healthcare Disease prediction, robotic surgeries, drug discovery

Finance Fraud detection, automated trading, credit scoring

Retail Customer behavior prediction, inventory management

Entertainment Movie and music recommendations (Netflix, Spotify)

Transportation Self-driving cars, traffic prediction, route optimization

Education Personalized learning, AI tutors, automated grading

Categories of Artificial Intelligence

AI can be classified into three broad categories based on its capabilities:

Type Description

Narrow AI AI that is specialized in one task (e.g., Alexa, Siri)

General AI AI that can perform any intellectual task a human can

Super AI Hypothetical AI that surpasses human intelligence

🔥 Fact: Currently, we are only at the Narrow AI stage.

How AI Systems Work (High-Level View)

AI systems typically follow these steps:

1. Collect Data: Gather relevant data (images, texts, numbers).

2. Train Models: Teach machines using the data.

3. Make Predictions: Use the trained model to make decisions.

4. Improve Over Time: Continuously learn from new data.



Example:
An AI system that predicts house prices would collect data (house size, location,
price), learn patterns, and then predict prices for new houses.
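
To make this concrete, here is a minimal sketch of that collect-train-predict loop in Python. The house records are invented for illustration, and the scikit-learn library used here is introduced properly in later chapters:

from sklearn.linear_model import LinearRegression

# Collect Data: a few invented house records
sizes = [[50], [80], [120], [200]]   # house size in square metres (features)
prices = [150, 240, 360, 600]        # sale price in thousands (labels)

# Train Model: learn the size-to-price pattern
model = LinearRegression()
model.fit(sizes, prices)

# Make Predictions: estimate the price of a new 100 m² house
print(model.predict([[100]]))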

Why Python is the Best Language for AI

Python is the most popular programming language for AI and Machine Learning
because:

- Simple and readable syntax (easy to focus on logic, not language details)
- Vast number of libraries (like TensorFlow, Keras, scikit-learn, OpenCV)
- Active community support (forums, tutorials, open-source projects)
- Integration capabilities (with C/C++, Java, or web applications)

Popular Python Libraries for AI Development

Library Purpose

NumPy Handling numerical operations and arrays

Pandas Data manipulation and analysis

Matplotlib Visualization of data (plots and charts)

scikit-learn Machine learning models and algorithms

TensorFlow Building and training deep learning models

OpenCV Image processing and computer vision tasks

NLTK, spaCy Natural language processing (text analysis)

Quick Example: How AI Learns from Data


Let's imagine you want an AI to recognize apples and bananas:

- You show the AI 1,000 pictures of apples and 1,000 pictures of bananas.
- The AI analyzes colors, shapes, and textures, and learns what makes an apple different from a banana.
- Later, when shown a new image, the AI predicts whether it's an apple or a banana.



This is the power of learning from data instead of hardcoding rules manually.

Fun Fact:

The human brain has around 86 billion neurons. Deep learning tries to mimic the
way these neurons work — but even the best AI models today are still far simpler
than the human brain!

Summary

- AI is about making machines behave intelligently.
- Real-world AI is already present in healthcare, banking, education, entertainment, and more.
- Python is the number one language for AI development.
- AI systems learn from data, improve over time, and make decisions.



Setting Up AI Development Environment with Python

Why Environment Setup is Crucial

Before we dive deep into AI coding, it's critical to set up a professional Python environment. A well-organized setup will:

- Avoid errors and version conflicts
- Make your projects scalable and maintainable
- Save hours of debugging later

Step 1: Install Python (Latest Stable Version)

Python Download Page:
👉 https://www.python.org/downloads/

Recommended Version:
- Python 3.10 or above

Installation Tips:
- During installation, check the box "Add Python to PATH" — this makes Python accessible from anywhere in your system.

Step 2: Install an Integrated Development Environment (IDE)

A good IDE helps you write clean, error-free code faster.

IDE Why Use It?

Visual Studio Code Lightweight, highly customizable, free

PyCharm Best for Python projects, intelligent suggestions

Jupyter Notebook Great for data exploration and experiments



Recommendation:
Start with Visual Studio Code (VS Code) — it’s beginner-friendly and powerful.

👉 Download VS Code here: https://code.visualstudio.com/

Step 3: Install Key Python Packages

After installing Python and your IDE, open a terminal (or command prompt) and install the essential AI libraries:

pip install numpy pandas matplotlib scikit-learn jupyter notebook seaborn

Package Overview:

- numpy: Math and array operations
- pandas: Data handling
- matplotlib: Graphs and charts
- scikit-learn: Machine learning models
- jupyter notebook: For interactive AI coding
- seaborn: Beautiful data visualizations

Step 4: Set Up Virtual Environments (Best Practice)

When working on multiple AI projects, different projects might require different versions of libraries. This is where virtual environments come in.

How to create a virtual environment:

# Install virtualenv if not already installed
pip install virtualenv

# Create a new virtual environment
virtualenv myenv

# Activate the environment
# Windows:
myenv\Scripts\activate
# macOS/Linux:
source myenv/bin/activate

Now, when you install packages, they stay inside this isolated environment.



Step 5: Install Jupyter Notebook for Interactive Coding

Jupyter Notebook is excellent for:

- Running code in small cells
- Writing notes and explanations alongside your code
- Visualizing data immediately

To launch Jupyter Notebook:

jupyter notebook

It will open in your web browser. You can now create .ipynb files to code your AI experiments.

Step 6: Folder Structure for AI Projects

Organizing your project folders properly will make your life easier.

Recommended folder structure:

my_ai_project/

├── data/ # Datasets go here

├── notebooks/ # Jupyter notebooks

├── models/ # Saved machine learning models

├── scripts/ # Python scripts (.py files)

├── README.md # Project overview

└── requirements.txt # List of project packages

Step 7: Save Requirements for Easy Setup


Once you install all necessary packages inside your environment, save them for
future reference.
pip freeze > requirements.txt
This generates a file listing all installed libraries.
Later, anyone (including you) can install them using:
pip install -r requirements.txt



Bonus Tip: Install Extensions for Visual Studio Code

Boost your coding experience by installing these extensions in VS Code:

- Python (Microsoft)
- Jupyter (Microsoft)
- Pylance (for fast IntelliSense)
- Code Runner (to quickly run Python files)

Troubleshooting Common Installation Problems

Problem                        Solution
"pip command not found"        Reinstall Python and ensure "Add Python to PATH" is checked
"ModuleNotFoundError"          Install the missing library using pip
Jupyter Notebook won't launch  Reinstall using pip install notebook and retry

Summary

- Python 3.10+ should be installed correctly with PATH.
- Use Visual Studio Code + Jupyter Notebook for a smooth workflow.
- Always use virtual environments to manage project dependencies.
- Organize your folders properly for professional projects.
- Save requirements.txt files to replicate environments easily.



Understanding Machine Learning — The Heart of AI

What is Machine Learning?

Machine Learning (ML) is a subset of Artificial Intelligence that focuses on building systems that can learn from data rather than being explicitly programmed.

In simple words:
Instead of writing code to tell a computer how to perform a task, we give it data and let it figure out the best way to do the task by itself.

How Machine Learning Works (In a Nutshell)

1. Input Data → Give the machine a lot of examples (data).

2. Train Model → The machine finds patterns inside the data.

3. Make Predictions → The trained model predicts outcomes for new data.

Real-World Examples of Machine Learning

Task Machine Learning Application

Email Filtering Spam or not spam detection

Online Shopping Product recommendation engines

Banking Fraud transaction detection

Healthcare Disease diagnosis from scans

Social Media Personalized content feeds

Three Types of Machine Learning

Type                    Description                               Example
Supervised Learning     Learn from labeled data (input → output)  Email spam detection
Unsupervised Learning   Find hidden patterns in unlabeled data    Customer segmentation
Reinforcement Learning  Learn through rewards and penalties       Training a robot to walk

1. Supervised Learning

- Input: Data + correct answers (labels)
- Goal: Learn a mapping from input to output
- Common Algorithms: Linear Regression, Decision Trees, Support Vector Machines

Example:
Given 10,000 house sale records (size, location, price), predict the price of a new house.

2. Unsupervised Learning

- Input: Only data (no labels)
- Goal: Discover hidden patterns or groupings
- Common Algorithms: K-Means Clustering, PCA (Principal Component Analysis)

Example:
Group customers into different segments based on their buying habits — without any predefined categories.

3. Reinforcement Learning

- Input: An agent interacts with an environment
- Goal: Maximize cumulative rewards
- Common Algorithms: Q-Learning, Deep Q-Networks

Example:
Teaching a self-driving car to stay on the road by giving it a "reward" for every
successful move.



Important Terms You Should Know

Term           Meaning
Training Data  Data used to teach the machine
Testing Data   New data used to test the machine's performance
Model          Mathematical structure that makes predictions
Features       Input variables used for prediction (e.g., size of a house)
Labels         Correct outputs (e.g., price of a house)
Overfitting    When a model learns too much from training data and fails on new data
Underfitting   When a model fails to capture patterns from data
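
To see how these terms map onto code, here is a minimal sketch of splitting a dataset into training and testing data with scikit-learn; the numbers are invented for illustration:

import numpy as np
from sklearn.model_selection import train_test_split

X = np.arange(10).reshape(-1, 1)  # features
y = np.arange(10)                 # labels

# Hold out 20% of the rows as testing data to measure performance on unseen examples
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

print(len(X_train), "training rows,", len(X_test), "testing rows")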

Visual Understanding of Machine Learning


Imagine you're learning to recognize apples vs bananas:

- Training Phase: You see 1,000 images — you learn bananas are yellow, apples are often red or green.
- Testing Phase: Someone shows you a fruit you've never seen before. Based on your learning, you guess if it's an apple or a banana.

That's exactly how ML models behave!

Key Machine Learning Algorithms You Will Learn

Algorithm                     Used For
Linear Regression             Predicting continuous values (house prices)
Logistic Regression           Binary classification (spam or not)
Decision Trees                Easy-to-understand models for classification and regression
K-Nearest Neighbors (KNN)     Classification based on closest examples
Support Vector Machine (SVM)  Powerful for high-dimensional classification
K-Means Clustering            Grouping data into clusters without labels

Quick Python Example: Basic Machine Learning Flow

Here's a tiny example using scikit-learn to predict a simple pattern.

from sklearn.linear_model import LinearRegression
import numpy as np

# Example Data
X = np.array([[1], [2], [3], [4], [5]])  # Features
y = np.array([2, 4, 6, 8, 10])           # Labels

# Create and train model
model = LinearRegression()
model.fit(X, y)

# Predict for a new input
prediction = model.predict([[6]])
print("Prediction for input 6:", prediction)

Output:
Prediction for input 6: [12.]

This means the model learned the pattern y = 2x automatically!

Myths vs Reality About Machine Learning

Myth Reality

"ML is only for geniuses" Anyone who practices can learn it

"You need huge data always" Many useful models work with small datasets

"ML models are always right" Models are just guesses — they can be wrong too!

Summary

- Machine Learning allows machines to learn from data instead of being hardcoded.
- There are three types: Supervised, Unsupervised, and Reinforcement Learning.
- Understanding the data, features, and model selection is the key to success.
- Python makes it very easy to implement basic ML models using scikit-learn.



Supervised Learning Deep Dive — Regression and Classification Models

What is Supervised Learning?

Supervised Learning is when a machine learns from labeled examples — meaning, each training input has a known correct output.

Simple Example:
You give a model data of houses (size, number of rooms) along with their prices. The model then learns how size and rooms affect price, and later predicts prices for new houses.

Two Major Types of Supervised Learning

Type What It Does Example

Regression Predicts continuous numbers Predicting house prices

Classification Predicts categories or classes Predicting if email is spam or not

Part 1: Regression — Predicting Continuous Values

What is Regression?

Regression is about predicting a number based on input features.

Example:
Predicting:

- House price based on size
- Salary based on years of experience
- Temperature tomorrow based on past weather data

Most Common Regression Model: Linear Regression

Understanding Linear Regression



Linear Regression finds a straight-line relationship between the input variable(s) and the output.

Simple formula:
y = mx + c

Where:

- y is the prediction (output)
- m is the slope (how much y changes with x)
- x is the input (feature)
- c is the intercept (where the line crosses the y-axis)

Python Example: Linear Regression

Let's predict salary based on years of experience!

import numpy as np
from sklearn.linear_model import LinearRegression

# Example data: years of experience
X = np.array([[1], [2], [3], [4], [5]])
# Corresponding salary (in thousands)
y = np.array([30, 35, 40, 45, 50])

# Create model
model = LinearRegression()
# Train model
model.fit(X, y)

# Predict salary for 6 years of experience
predicted_salary = model.predict([[6]])
print(f"Predicted Salary: {predicted_salary[0]}K")

Output:
Predicted Salary: 55.0K

Visualizing Linear Regression


Plotting the data points and regression line:

import matplotlib.pyplot as plt

plt.scatter(X, y, color='blue')             # Data points
plt.plot(X, model.predict(X), color='red')  # Regression line
plt.title('Experience vs Salary')
plt.xlabel('Years of Experience')
plt.ylabel('Salary (K)')
plt.show()

Evaluation Metrics for Regression

Metric                     Meaning
Mean Absolute Error (MAE)  Average absolute difference between predictions and actual values
Mean Squared Error (MSE)   Average of squared differences (penalizes bigger errors)
R² Score                   How well the model explains the variability (closer to 1 is better)
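
As a quick illustration, all three metrics are available in scikit-learn's sklearn.metrics module. The sketch below scores the salary model above on its own training data purely to show the calls; a real evaluation would use held-out test data:

from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_pred = model.predict(X)
print("MAE:", mean_absolute_error(y, y_pred))
print("MSE:", mean_squared_error(y, y_pred))
print("R² :", r2_score(y, y_pred))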

Part 2: Classification — Predicting Categories

What is Classification?

Classification is about predicting a label or category for input data.

Example:

- Predict whether an email is spam or not
- Predict whether a tumor is malignant or benign
- Predict if a customer will buy or not

Most Common Classification Model: Logistic Regression

Understanding Logistic Regression

Important: Despite its name, Logistic Regression is used for classification tasks!

It predicts probabilities between 0 and 1 using the sigmoid function, p = 1 / (1 + e^(-z)), where z is a weighted sum of the inputs.

If probability > 0.5: Predict Class 1
If probability < 0.5: Predict Class 0

Python Example: Logistic Regression



Predict whether a student passes an exam based on hours studied:

from sklearn.linear_model import LogisticRegression
import numpy as np

# Example data: hours studied
X = np.array([[1], [2], [3], [4], [5]])
# 0 = Fail, 1 = Pass
y = np.array([0, 0, 0, 1, 1])

# Create model
model = LogisticRegression()
# Train model
model.fit(X, y)

# Predict for 6 hours studied
prediction = model.predict([[6]])
print(f"Prediction (0 = Fail, 1 = Pass): {prediction[0]}")

Output:
Prediction (0 = Fail, 1 = Pass): 1

Visualizing Logistic Regression

Plotting the sigmoid curve:

import numpy as np
import matplotlib.pyplot as plt

# Scatter plot of actual data
plt.scatter(X, y, color='blue')

# Plot the logistic curve
X_test = np.linspace(0, 7, 300)
y_prob = model.predict_proba(X_test.reshape(-1, 1))[:, 1]

plt.plot(X_test, y_prob, color='red')
plt.title('Hours Studied vs Passing Probability')
plt.xlabel('Hours Studied')
plt.ylabel('Probability of Passing')
plt.show()

Evaluation Metrics for Classification

Metric            Meaning
Accuracy          Percentage of correct predictions
Precision         Correct positive predictions out of total predicted positives
Recall            Correct positive predictions out of all actual positives
F1 Score          Harmonic mean of precision and recall
Confusion Matrix  Table showing TP, TN, FP, FN
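
As a quick illustration, scikit-learn provides all of these in sklearn.metrics; the true and predicted labels below are invented just to show the calls:

from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

y_true = [0, 0, 1, 1, 1, 0]  # actual labels (invented)
y_pred = [0, 1, 1, 1, 0, 0]  # model predictions (invented)

print("Accuracy:", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall:", recall_score(y_true, y_pred))
print("F1 Score:", f1_score(y_true, y_pred))
print("Confusion Matrix:\n", confusion_matrix(y_true, y_pred))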

When to Use Regression vs Classification?

Goal Use

Predicting a continuous number Regression

Predicting a category or label Classification

Summary

- Supervised Learning = training with labeled data.
- Regression predicts continuous values (e.g., price, salary).
- Classification predicts categories (e.g., spam or not).
- Linear Regression for regression problems, Logistic Regression for classification problems.
- Model evaluation is crucial to assess performance.



Unsupervised Learning Deep Dive — Discovering Hidden Patterns

What is Unsupervised Learning?

In Unsupervised Learning, the model is given unlabeled data — meaning, no correct answers are provided. The model tries to find patterns, structures, or groupings within the data on its own.

Simple Real-Life Examples:

- Grouping customers based on purchasing behavior.
- Organizing similar news articles together.
- Detecting fraud or unusual patterns in bank transactions.

Key Techniques in Unsupervised Learning

Technique                 What It Does                              Example
Clustering                Group similar data points into clusters   Grouping customers into segments
Dimensionality Reduction  Simplify data by reducing features        Compressing images, speeding up models

Part 1: Clustering — Grouping Data

What is Clustering?

Clustering is about grouping similar things together without being told which
group is correct.

Example:
Grouping customers based on age and spending habits without telling the model in
advance who belongs where.

Most Common Clustering Algorithm: K-Means Clustering



Understanding K-Means Clustering

K-Means algorithm tries to partition data into K clusters where each data point
belongs to the cluster with the nearest mean.

How K-Means works:

1. Choose the number of clusters (K).
2. Randomly place K cluster centers.
3. Assign each point to the nearest cluster center.
4. Move each cluster center to the average position of the points assigned to it.
5. Repeat steps 3-4 until the centers stabilize.

Visual Example

Imagine you have points scattered on a sheet. K-Means will:

- Form groups of close points together.
- Place a center inside each group.

Python Example: K-Means Clustering

Let's group customers based on their age and spending score!

import numpy as np
from sklearn.cluster import KMeans
import matplotlib.pyplot as plt

# Example customer data: [Age, Spending Score]
X = np.array([
    [25, 50],
    [30, 60],
    [22, 55],
    [40, 90],
    [42, 100],
    [41, 95],
    [20, 20],
    [21, 25],
    [23, 18]
])

# Create KMeans model with 3 clusters
kmeans = KMeans(n_clusters=3, random_state=0)
kmeans.fit(X)

# Cluster assigned to each data point
labels = kmeans.labels_

# Plot clusters
plt.scatter(X[:, 0], X[:, 1], c=labels, cmap='viridis')
plt.scatter(kmeans.cluster_centers_[:, 0],
            kmeans.cluster_centers_[:, 1], color='red', marker='X', s=200)
plt.title('Customer Segments')
plt.xlabel('Age')
plt.ylabel('Spending Score')
plt.show()

How to Choose the Best K?

Use the Elbow Method:

- Try different values of K (e.g., 1 to 10).
- Plot K vs Within-Cluster Sum of Squares (WCSS).
- Look for a point where adding another cluster doesn't improve much — the "elbow".

Elbow Method Example:

wcss = []
for k in range(1, 11):
    kmeans = KMeans(n_clusters=k, random_state=0)
    kmeans.fit(X)
    wcss.append(kmeans.inertia_)

plt.plot(range(1, 11), wcss)
plt.title('Elbow Method')
plt.xlabel('Number of clusters (K)')
plt.ylabel('WCSS')
plt.show()

Look for the "bend" in the curve — that's your ideal K!

Part 2: Dimensionality Reduction — Simplifying Data

What is Dimensionality Reduction?



It means reducing the number of input features while preserving as much information as possible.

Example:
Reducing a 1000-feature image into just 2 or 3 features for visualization.

Why Do We Need Dimensionality Reduction?

- Makes visualization possible (2D or 3D).
- Speeds up machine learning models.
- Removes noise from the data.
- Prevents overfitting.

Most Common Technique: Principal Component Analysis (PCA)

Understanding PCA

PCA transforms the data into a new set of dimensions (called principal components) that best capture the variance (spread) in the data.

- First Principal Component: The direction with maximum variance.
- Second Principal Component: Perpendicular to the first, with the next highest variance.

Python Example: PCA


Let's apply PCA to the 2-D customer data (a trivial 2D-to-2D transform, just to demonstrate the API):

from sklearn.decomposition import PCA

# Using the same customer data X
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)

print(X_pca)

PCA is more powerful when dealing with higher dimensions (e.g., hundreds of features).

Summary

- Unsupervised Learning finds patterns in unlabeled data.
- Clustering groups similar points; K-Means is the most famous method.
- The Elbow Method helps find the best number of clusters.
- Dimensionality Reduction simplifies data and makes visualization easier.
- PCA is a powerful tool to reduce the number of features.



Neural Networks Fundamentals — Building Brains for AI

What is a Neural Network?

A Neural Network is a computer system inspired by the human brain. It tries to learn patterns from data by simulating how a brain's neurons work.

Think of it like layers of math operations that gradually learn to predict or classify things.

Real-Life Examples of Neural Networks:

- Recognizing faces in photos.
- Translating languages automatically.
- Predicting stock market trends.
- Identifying spam emails.

Anatomy of a Neural Network

A basic Neural Network has:

Part What It Does

Input Layer Receives the raw data

Hidden Layers Processes and transforms the data

Output Layer Gives the final result

Visual Structure

Input Layer ➔ Hidden Layer(s) ➔ Output Layer


Each layer is made up of neurons (also called nodes).

Understanding a Single Neuron


A neuron in a network:

- Takes inputs (numbers).
- Multiplies each input by a weight.
- Adds a bias.
- Applies an activation function to decide the output.

Simple Formula:
Output = ActivationFunction( (Weight1 × Input1) + (Weight2 × Input2) + ... + Bias )
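
Here is a minimal sketch of that formula as plain Python, using the sigmoid activation; the weights, bias, and inputs are invented for illustration:

import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes the result into the range (0, 1)
    return 1 / (1 + math.exp(-z))

print(neuron([0.5, 0.8], [0.4, -0.6], 0.1))  # prints roughly 0.455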

Activation Functions
Activation functions decide if a neuron should fire (activate) or not.

Common types:

Activation Function Shape Purpose

Sigmoid S-curve Good for probabilities

ReLU (Rectified Linear Unit) Linear for positive values Very fast and popular

Tanh S-curve (centered at 0) Used in old networks

Most popular today:


✅ ReLU for hidden layers
✅ Sigmoid or Softmax for output layers
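
For concreteness, here is a small sketch of all three activation functions with NumPy:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))  # S-curve between 0 and 1

def relu(z):
    return np.maximum(0, z)      # 0 for negatives, linear for positives

def tanh(z):
    return np.tanh(z)            # S-curve between -1 and 1

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z), relu(z), tanh(z))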

Building Your First Neural Network with Python (Using Keras)


We will use TensorFlow and Keras, which make building neural networks very easy.

Install libraries if you haven't already:


pip install tensorflow

Python Example: Basic Neural Network

Suppose you want to predict if a student will pass or fail based on hours studied.

import tensorflow as tf
from tensorflow import keras
import numpy as np

# Input data (hours studied)
X = np.array([[1], [2], [3], [4], [5]], dtype=float)

# Output labels (0 = Fail, 1 = Pass)
y = np.array([[0], [0], [0], [1], [1]], dtype=float)

# Build a simple model: one dense neuron with a sigmoid activation
model = keras.Sequential([
    keras.layers.Dense(units=1, input_shape=[1], activation='sigmoid')
])

# Compile the model
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])

# Train the model
model.fit(X, y, epochs=500)

# Test the model
print(model.predict(np.array([[1.5], [3.5]])))

Understanding This Code:

- Dense Layer: Fully connected layer of neurons.
- Activation: Sigmoid (because the output is 0 or 1).
- Loss Function: Binary Cross-Entropy (good for binary classification).
- Optimizer: Adam (an advanced optimization algorithm).

How Neural Networks Learn?


They learn through a process called Backpropagation:

 Calculate how wrong the prediction is (error).

 Adjust the weights and biases slightly to reduce the error.

 Repeat this process many times (epochs).
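
Here is a minimal sketch of the "adjust the weights to reduce the error" idea, fitting y = w × x by gradient descent on a squared-error loss. The data and learning rate are invented for illustration; frameworks like Keras perform this (plus backpropagation through many layers) automatically:

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]           # true relationship: y = 2x
w = 0.0                        # start from an uninformed weight
learning_rate = 0.05

for epoch in range(100):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= learning_rate * grad  # step opposite the gradient to reduce the error

print(w)                       # approaches 2.0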

Important Terms to Know:

Term           Meaning
Epoch          One complete pass through the training data
Loss           How wrong the model's prediction is
Optimizer      Algorithm that updates the weights
Learning Rate  How big the weight adjustments are

Summary

- Neural Networks simulate the brain to learn patterns from data.
- They consist of input, hidden, and output layers.
- Neurons perform weighted math operations and use activation functions.
- We can build simple neural networks easily using TensorFlow and Keras.
- Neural networks learn by adjusting weights using backpropagation.



Project — Build a Neural Network to Classify Handwritten Digits

Goal of This Project

We will build a neural network that can recognize handwritten digits (0–9) using
the famous MNIST dataset.

What is the MNIST Dataset?

The MNIST dataset is a collection of 70,000 grayscale images of handwritten digits from 0 to 9:

- Each image is 28x28 pixels (i.e., 784 features).
- It's a classic dataset used to test and learn image classification techniques.

What We'll Learn:


- How to load and preprocess image data.
- How to build a multi-layer neural network.
- How to train, evaluate, and test the model.
- How to predict custom digits using Python.

Step 1: Import Required Libraries


import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt

Step 2: Load the MNIST Dataset

Keras gives us MNIST built-in — no download required.

# Load the dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

Step 3: Visualize the Data

Let's see the first image:

plt.imshow(x_train[0], cmap='gray')
plt.title(f"Label: {y_train[0]}")
plt.show()

Step 4: Normalize the Data

Always normalize image pixel values to make the model learn better.

x_train = x_train / 255.0
x_test = x_test / 255.0

Step 5: Build the Neural Network Model

We will use:

- An input layer that flattens 28x28 pixels into 784.
- Two hidden layers with ReLU.
- An output layer with 10 neurons (for 0-9) using Softmax.

model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')  # 10 output classes
])

Step 6: Compile the Model


model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Step 7: Train the Model


model.fit(x_train, y_train, epochs=5)

Tip: You can increase the number of epochs to improve accuracy if needed.

Step 8: Evaluate the Model on Test Data

test_loss, test_accuracy = model.evaluate(x_test, y_test)
print(f"Test accuracy: {test_accuracy}")

Step 9: Make Predictions


import numpy as np

predictions = model.predict(x_test)

# Predict the first test image
predicted_label = np.argmax(predictions[0])
print(f"Predicted: {predicted_label}")
plt.imshow(x_test[0], cmap='gray')
plt.show()

Step 10: Predict Your Own Digit (Bonus)

You can draw a digit using any drawing app, resize it to 28x28 pixels, and use it like this:

from PIL import Image
import numpy as np

# Load your own digit image ('L' converts it to grayscale)
image = Image.open('my_digit.png').convert('L')
image = image.resize((28, 28))
image_array = np.array(image) / 255.0

# Reshape for model input
image_array = image_array.reshape(1, 28, 28)

# Predict
prediction = model.predict(image_array)
print("Predicted Digit:", np.argmax(prediction))

Summary

- You just built your first AI project using real-world data!
- You used TensorFlow + Keras to create a neural network.
- You trained it to recognize digits with high accuracy.
- You learned how to visualize predictions and test custom images.



Deep Learning for Image Classification — CNNs Explained

What is a Convolutional Neural Network (CNN)?

A Convolutional Neural Network (CNN) is a specialized type of neural network used for processing grid-like data such as images. CNNs are powerful because they can automatically learn spatial hierarchies of features (like edges, textures, and shapes) from images.

Why CNNs for Image Classification?

Traditional neural networks (fully connected networks) struggle with images because
the number of parameters increases dramatically. CNNs overcome this by using
shared weights and local receptive fields, making them efficient and scalable for
image tasks.

CNN Layers — A Breakdown


A CNN typically consists of several types of layers:

1. Convolutional Layers:
These layers perform a convolution operation, using filters (also called kernels) to detect features such as edges, textures, and shapes.

2. Pooling Layers:
These layers reduce the dimensions of the data, keeping only the most important information. Max Pooling is the most common method.

3. Fully Connected (FC) Layers:
These are traditional neural network layers that classify the features extracted by the convolution and pooling layers.

4. Activation Functions:
Just like in a normal neural network, CNNs use ReLU or Softmax to introduce non-linearity.

CNN Architecture

A typical CNN consists of alternating layers of Convolution and Pooling followed by one or more Fully Connected Layers:

Input Image ➔ Convolution Layer ➔ Pooling Layer ➔ Convolution Layer ➔ Pooling Layer ➔ Fully Connected Layer ➔ Output

Example: Building a Simple CNN for Image Classification

Let’s create a simple CNN for classifying images from the MNIST dataset.

Step 1: Import Libraries


import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt

Step 2: Load and Prepare the Data


We'll use the MNIST dataset again, but this time we'll reshape the data for a CNN.

# Load the dataset
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Normalize the data
x_train = x_train / 255.0
x_test = x_test / 255.0

# Reshape for CNN input (28x28x1) -> 1 color channel (grayscale)
x_train = x_train.reshape(-1, 28, 28, 1)
x_test = x_test.reshape(-1, 28, 28, 1)

Step 3: Build the CNN Model

Now, let's define the layers of the CNN.

model = keras.Sequential([
    keras.layers.Conv2D(32, (3, 3), activation='relu',
                        input_shape=(28, 28, 1)),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Conv2D(64, (3, 3), activation='relu'),
    keras.layers.MaxPooling2D((2, 2)),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')  # 10 output classes
])



Step 4: Compile the Model

We'll use the Adam optimizer and Sparse Categorical Crossentropy for multi-class classification.

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

Step 5: Train the Model

Let’s train the CNN model on the MNIST data.


model.fit(x_train, y_train, epochs=5)

Step 6: Evaluate the Model

We’ll check how well the model performs on the test data.
test_loss, test_accuracy = model.evaluate(x_test, y_test)
print(f"Test accuracy: {test_accuracy}")

Step 7: Make Predictions


Now we can use the trained model to predict some digits.

import numpy as np

predictions = model.predict(x_test)

# Predict the first test image
predicted_label = np.argmax(predictions[0])
print(f"Predicted: {predicted_label}")
# Drop the channel dimension (28, 28, 1) -> (28, 28) for plotting
plt.imshow(x_test[0].reshape(28, 28), cmap='gray')
plt.title(f"Predicted: {predicted_label}")
plt.show()

Key Concepts in CNNs:

1. Convolution: This operation extracts features (like edges, corners, textures).

2. Max Pooling: This reduces the image size by taking the maximum value in a region, keeping important features (see the small sketch below).

3. Flattening: Converts the 2D image data into a 1D vector to feed into fully connected layers.

4. Fully Connected Layers: Use the extracted features to make final classifications.
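
To make Max Pooling concrete, here is a tiny sketch (separate from the model above) of 2x2 max pooling applied to an invented 4x4 "image" with NumPy:

import numpy as np

image = np.array([[1, 3, 2, 0],
                  [4, 6, 1, 2],
                  [7, 2, 9, 5],
                  [3, 1, 4, 8]])

# Split the image into non-overlapping 2x2 blocks and keep each block's maximum
pooled = image.reshape(2, 2, 2, 2).max(axis=(1, 3))
print(pooled)
# [[6 2]
#  [7 9]]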

Summary

- CNNs are specialized networks for processing images, recognizing patterns like edges and textures.
- They are made of Convolutional layers, Pooling layers, and Fully Connected layers.
- CNNs are efficient for image classification tasks because they reduce the number of parameters compared to fully connected networks.
- You built your first CNN model to classify MNIST digits and achieved high accuracy!



Advanced Image Classification — Transfer Learning

What is Transfer Learning?

Transfer Learning is a powerful technique where we reuse a pre-trained model — a model that has already been trained on a large dataset — and adapt it to a new, related task.

Instead of starting from scratch, we transfer the learning from a large, general dataset (like ImageNet with millions of images) to solve a smaller, specific problem.

Why Use Transfer Learning?

- Saves Time and Resources: Training a deep neural network from scratch can take days or weeks. Transfer learning allows you to get great results in just a few minutes or hours.
- Better Accuracy: Pre-trained models have learned powerful feature representations that help improve accuracy on small datasets.
- Less Data Needed: You don't need millions of labeled examples anymore!

Popular Pre-trained Models

Here are some popular models often used for transfer learning:

- VGG16
- ResNet50
- InceptionV3
- MobileNet

These models have been trained on massive datasets like ImageNet (over 14 million images!).

How Does Transfer Learning Work?

Usually, you follow these two steps:

1. Freeze the convolutional base (the pre-trained part) so its weights are not updated.
2. Add and train new layers on top to fit your specific task.

Visual:

Pre-trained Layers (Frozen) ➔ New Layers (Trainable) ➔ Output

Example: Using VGG16 for Image Classification

We’ll use the VGG16 model pre-trained on ImageNet and adapt it to classify images.

Step 1: Import Libraries


import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers, models

Step 2: Load Pre-trained VGG16 Model

We load VGG16 without the top classification layer, keeping only the convolutional base.

# Load VGG16 without the top layer
base_model = keras.applications.VGG16(weights='imagenet',
                                      include_top=False,
                                      input_shape=(150, 150, 3))

# Freeze the base model
base_model.trainable = False

Step 3: Add Custom Layers


Now, let's add new trainable layers on top.

model = models.Sequential([
    base_model,
    layers.Flatten(),
    layers.Dense(256, activation='relu'),
    layers.Dense(1, activation='sigmoid')  # Binary classification (can be changed)
])

Step 4: Compile the Model


model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])



Step 5: Prepare Your Dataset

You should have your images organized like this:

dataset/
    train/
        class1/
        class2/
    validation/
        class1/
        class2/

We'll load them using ImageDataGenerator.

from tensorflow.keras.preprocessing.image import ImageDataGenerator

train_datagen = ImageDataGenerator(rescale=1./255)
val_datagen = ImageDataGenerator(rescale=1./255)

train_generator = train_datagen.flow_from_directory(
    'dataset/train/',
    target_size=(150, 150),
    batch_size=32,
    class_mode='binary'
)

val_generator = val_datagen.flow_from_directory(
    'dataset/validation/',
    target_size=(150, 150),
    batch_size=32,
    class_mode='binary'
)

Step 6: Train the Model

history = model.fit(
    train_generator,
    epochs=5,
    validation_data=val_generator
)

Step 7: Evaluate and Fine-tune (Optional)


Once the custom layers are trained, you can unfreeze some of the top layers of the base model to fine-tune.

base_model.trainable = True

# Re-compile the model with a very low learning rate
model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-5),
              loss='binary_crossentropy',
              metrics=['accuracy'])

# Fine-tune
model.fit(
    train_generator,
    epochs=3,
    validation_data=val_generator
)

Important Tips for Transfer Learning:

- Always start by freezing the base model first.
- Train only the custom layers initially.
- Later, fine-tune the base model with a very small learning rate to avoid destroying its pre-trained weights.

Summary

- Transfer Learning allows you to reuse powerful pre-trained models and adapt them to your tasks.
- It saves huge amounts of time and delivers better results with less data.
- You built an image classification system using VGG16 pre-trained on ImageNet.
- You learned how to freeze, add new layers, train, and even fine-tune models.



Natural Language Processing (NLP) Basics with Python

What is Natural Language Processing (NLP)?

Natural Language Processing (NLP) is a branch of Artificial Intelligence that helps machines understand, interpret, and generate human language.

With NLP, computers can:

- Understand emails, chat messages, and articles.
- Translate languages (like Google Translate).
- Analyze customer reviews.
- Generate human-like text.

Common NLP Tasks

Here are some popular tasks in NLP:

- Text Classification: Classify emails into spam or not spam.
- Sentiment Analysis: Detect if a review is positive, negative, or neutral.
- Machine Translation: Translate text between languages.
- Chatbots: Build conversational systems that can chat with users.
- Summarization: Automatically create a summary of a long article.

NLP in Real Life


- Siri and Alexa understanding your commands.
- Grammarly correcting your grammar.
- Google Search suggesting relevant results.
- Netflix recommending movies based on reviews.

Step 1: Install Necessary Libraries

We'll use NLTK — a popular Python library for basic NLP tasks. Install it using pip:

pip install nltk

Step 2: Basic NLP Operations with NLTK

Let's learn some core NLP concepts: Tokenization, Stopwords Removal, and Stemming.

2.1 Tokenization

Tokenization is the process of breaking text into smaller parts (tokens), like words or sentences.

Example:

import nltk
nltk.download('punkt')  # Download the tokenizer models

from nltk.tokenize import word_tokenize

text = "Python is awesome! Let's learn AI."
tokens = word_tokenize(text)
print(tokens)

Output:
['Python', 'is', 'awesome', '!', 'Let', "'s", 'learn', 'AI', '.']

2.2 Stopwords Removal

Stopwords are very common words (like "is", "the", "a") that carry little meaning. Removing them makes text analysis more meaningful.

nltk.download('stopwords')
from nltk.corpus import stopwords

stop_words = set(stopwords.words('english'))
filtered_tokens = [word for word in tokens if word.lower() not in stop_words]
print(filtered_tokens)

Output:
['Python', 'awesome', 'Let', "'s", 'learn', 'AI', '.']

2.3 Stemming

Stemming reduces words to their root form.

Example: "playing", "played", and "plays" become "play".

from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
words = ["playing", "played", "plays", "player"]
stems = [stemmer.stem(word) for word in words]
print(stems)

Output:
['play', 'play', 'play', 'player']

Step 3: Build a Simple Sentiment Analyzer

Let's use what we learned to create a very basic sentiment analyzer!

Define Positive and Negative Words


positive_words = ["good", "great", "awesome", "fantastic", "love", "happy"]
negative_words = ["bad", "terrible", "awful", "hate", "horrible", "sad"]

Write the Analyzer


def simple_sentiment_analyzer(text):
    tokens = word_tokenize(text.lower())
    pos_count = sum(1 for word in tokens if word in positive_words)
    neg_count = sum(1 for word in tokens if word in negative_words)

    if pos_count > neg_count:
        return "Positive Sentiment"
    elif neg_count > pos_count:
        return "Negative Sentiment"
    else:
        return "Neutral Sentiment"

Test It

text1 = "I love Python, it's awesome!"
text2 = "I hate bugs, they are terrible!"

print(simple_sentiment_analyzer(text1))  # Positive Sentiment
print(simple_sentiment_analyzer(text2))  # Negative Sentiment

Important: Real Sentiment Analysis is More Complex


Our analyzer is kept simple for learning purposes. Real-world models use:

- Machine Learning (e.g., Logistic Regression, SVM).
- Deep Learning (e.g., LSTM, Transformers).
- Huge datasets for training.

We will touch on machine learning-based NLP soon!

Summary

- NLP allows machines to interact with human language.
- You learned basic operations: Tokenization, Stopword Removal, Stemming.
- You built a basic sentiment analyzer using simple word lists.



Spam Detection Using Machine Learning

What is Spam Detection?

Spam detection is one of the most practical and common applications of machine
learning today.
It helps automatically identify unwanted or harmful emails and messages.

Every time you see your "Spam" folder filled with junk emails — that's AI at work!

In this chapter, you’ll learn how to build a basic spam detector using Python and
machine learning.

How Spam Detection Works

Spam detection is typically treated as a binary classification problem:

- Spam (1) or
- Not Spam (0)

We train a machine learning model to classify incoming text (like an email) into these two categories based on features like:

- Words present
- Frequency of words
- Email metadata

Key Steps in Building a Spam Classifier

1. Data Collection
Gather labeled datasets of spam and non-spam emails/messages.
Example: the classic SMS Spam Collection dataset.

2. Text Preprocessing
Clean and transform the text data:
   - Lowercasing
   - Removing punctuation
   - Removing stopwords (common low-content words like "the", "and", "is")
   - Tokenization (splitting text into words)

3. Feature Extraction
Convert the text into numbers that a machine learning model can understand.
Common technique: TF-IDF (Term Frequency - Inverse Document Frequency).

4. Model Training
Train a classification algorithm, such as:
   - Logistic Regression
   - Naive Bayes
   - Support Vector Machine (SVM)

5. Model Evaluation
Test the model's performance on new unseen data using metrics like:
   - Accuracy
   - Precision
   - Recall
   - F1 Score

Example Code: Building a Basic Spam Detector


# Install required libraries
# pip install scikit-learn pandas

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

# Load dataset
data = pd.read_csv('spam.csv', encoding='latin-1')
data = data[['v1', 'v2']]
data.columns = ['label', 'text']

# Encode labels
data['label_num'] = data.label.map({'ham': 0, 'spam': 1})

# Split data
X_train, X_test, y_train, y_test = train_test_split(
    data['text'], data['label_num'], test_size=0.2, random_state=42)

# Feature extraction
vectorizer = TfidfVectorizer()
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

# Train model
model = MultinomialNB()
model.fit(X_train_vec, y_train)

# Predict and evaluate
y_pred = model.predict(X_test_vec)
print(f"Accuracy: {accuracy_score(y_test, y_pred)}")

Summary

- Spam detection is a real-world example of binary classification.
- Text preprocessing and feature extraction are key steps.
- Naive Bayes is a strong and simple model for spam detection tasks.



Deep Learning for Text Classification (with NLP)
Why Use Deep Learning for Text?

Traditional machine learning methods (like Naive Bayes) work well for small datasets. However, for larger, more complex text data, Deep Learning models like RNNs (Recurrent Neural Networks) and LSTMs (Long Short-Term Memory networks) perform much better.

Deep learning models can capture:

- The sequence of words
- The contextual meaning behind the text
- Long-range dependencies (words that are connected even if far apart)

Important Deep Learning Architectures for Text

- RNN (Recurrent Neural Network)
Designed to handle sequential data. It processes one word at a time and keeps track of past words.

- LSTM (Long Short-Term Memory)
An improved version of the RNN that solves the "short-term memory" problem. LSTMs are great at remembering information for long periods.

- GRU (Gated Recurrent Unit)
A simpler and faster variant of the LSTM, used in some cases.

Text Classification Pipeline with Deep Learning

1. Prepare the Text Data
   - Clean text (lowercase, remove special characters)
   - Tokenize (convert text to sequences of numbers)
   - Pad sequences to ensure equal-length inputs

2. Build the Deep Learning Model
   - Use an Embedding Layer to learn word representations
   - Add LSTM or GRU layers
   - Add Dense (fully connected) layers for output

3. Train and Evaluate the Model
   - Use appropriate loss functions (like binary cross-entropy)
   - Evaluate using accuracy, precision, recall, etc.

Example Code: Text Classification using LSTM


# Install required libraries
# pip install tensorflow keras

import numpy as np
import pandas as pd
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Load and preprocess dataset
data = pd.read_csv('spam.csv', encoding='latin-1')
data = data[['v1', 'v2']]
data.columns = ['label', 'text']

# Encode labels
data['label_num'] = data.label.map({'ham': 0, 'spam': 1})

# Prepare text data
tokenizer = Tokenizer()
tokenizer.fit_on_texts(data['text'])
sequences = tokenizer.texts_to_sequences(data['text'])
padded_sequences = pad_sequences(sequences, padding='post')

# Split data
X = padded_sequences
y = data['label_num'].values

from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Build LSTM model
model = Sequential([
    Embedding(input_dim=len(tokenizer.word_index) + 1, output_dim=64),
    LSTM(64),
    Dense(1, activation='sigmoid')
])

model.compile(loss='binary_crossentropy', optimizer='adam',
              metrics=['accuracy'])

model.fit(X_train, y_train, epochs=5,
          validation_data=(X_test, y_test))

# Evaluate model
loss, accuracy = model.evaluate(X_test, y_test)
print(f"Test Accuracy: {accuracy}")

Key Points to Remember

- Always preprocess text properly (tokenization, padding).
- Use an Embedding layer before feeding text into an LSTM/GRU.
- Tune the number of LSTM units, batch size, and epochs for better results.
- Deep learning models usually require more data and computation power.

Real-World Applications

Deep Learning for text classification powers:

- Spam filters
- Sentiment analysis
- Product review categorization
- Fake news detection
- Customer support ticket sorting



Computer Vision Basics and Image Classification

What is Computer Vision?

Computer Vision (CV) is a field of Artificial Intelligence (AI) that enables machines to see, understand, and analyze images or videos — just like humans.

The goal of Computer Vision is to train computers to extract meaningful information from visual data, so they can perform tasks like:

- Recognizing faces
- Detecting objects
- Understanding scenes
- Reading handwritten text
- Driving autonomous cars

In simple words:
👉 Computer Vision = teaching computers how to "see" and "think" about images.

Why is Computer Vision Important?

Computer Vision powers some of the most exciting innovations in today's world:

- Face ID unlocking your smartphone
- Self-driving cars understanding their surroundings
- Medical imaging systems diagnosing diseases
- Checkout-free retail stores recognizing items automatically
- Security cameras detecting suspicious activity

It has become a core technology for industries like healthcare, automotive, finance, agriculture, and entertainment.

How Computers See Images

Humans see images as a complete picture, but computers see images as arrays of numbers. For example, a grayscale image can be represented as a matrix where:

- Each pixel value ranges from 0 (black) to 255 (white).

For colored images (RGB):

- 3 matrices are used — the Red, Green, and Blue channels.

Example: An image of size 100x100 pixels will be represented as:

- For grayscale: a 100x100 matrix
- For RGB color: a 100x100x3 matrix

These numerical values allow computers to apply mathematical operations to understand patterns inside images.
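
Here is a minimal sketch of inspecting an image as an array with Pillow and NumPy ('photo.jpg' is a hypothetical file you would supply yourself):

from PIL import Image
import numpy as np

img = np.array(Image.open('photo.jpg'))                 # RGB image -> shape (height, width, 3)
gray = np.array(Image.open('photo.jpg').convert('L'))   # grayscale -> shape (height, width)

print(img.shape, gray.shape)
print(gray[0, 0])  # a single pixel value between 0 (black) and 255 (white)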

Basic Tasks in Computer Vision

Task                  Description                                    Example
Image Classification  Identify the main object in an image.          "This is a dog."
Object Detection      Detect multiple objects and their locations.   "There is a cat and a dog."
Image Segmentation    Label each pixel by the object it belongs to.  "This pixel belongs to a cat, this one to a dog."
Face Recognition      Identify who is in the image.                  "This is John Doe."
Pose Estimation       Detect human body keypoints.                   "This is the arm, this is the leg."

In this chapter, we will focus on Image Classification, the foundation of Computer Vision.

What is Image Classification?

Image Classification means teaching a computer to categorize images into predefined classes.

For example:

- Given an image of an animal, the model should say whether it is a dog, cat, or horse.

The basic steps are:

1. Input: An image.
2. Output: A predicted label (class) from a set of known categories.

How to Build an Image Classifier

Step 1: Prepare the Dataset

- Collect labeled images for each category.
- Example dataset: CIFAR-10 (contains 60,000 images in 10 classes like airplane, car, bird, etc.)

Step 2: Preprocess the Images

- Resize images to the same dimensions.
- Normalize pixel values (0-255 → 0-1) for faster training.

Step 3: Build the Model

- Use a deep learning model, typically a Convolutional Neural Network (CNN).

Step 4: Train the Model

- Feed the model the training data.
- Use a loss function and an optimizer to improve model accuracy.

Step 5: Evaluate the Model

- Test the model with unseen data.
- Measure accuracy, precision, recall, etc.

Introduction to Convolutional Neural Networks (CNNs)

CNNs are a special type of neural network designed for processing images.

Key Layers of a CNN:

- Convolution Layer: Extracts features like edges, colors, textures.
- Pooling Layer: Reduces the spatial size to speed up computation.
- Fully Connected Layer: Makes the final decision about the class.

CNNs are highly effective because they automatically learn important features from images, reducing the need for manual feature engineering.

Example Code: Simple Image Classification using CNN

# Install TensorFlow if not already installed
# pip install tensorflow

import tensorflow as tf
from tensorflow.keras import datasets, layers, models
import matplotlib.pyplot as plt

# Load CIFAR-10 dataset
(train_images, train_labels), (test_images, test_labels) = datasets.cifar10.load_data()

# Normalize pixel values
train_images, test_images = train_images / 255.0, test_images / 255.0

# Class names
class_names = ['airplane', 'automobile', 'bird', 'cat', 'deer',
               'dog', 'frog', 'horse', 'ship', 'truck']

# Build CNN model
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation='relu'),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(10)
])

# Compile model
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])

# Train model
history = model.fit(train_images, train_labels, epochs=10,
                    validation_data=(test_images, test_labels))

# Evaluate model
test_loss, test_acc = model.evaluate(test_images, test_labels, verbose=2)
print(f"Test Accuracy: {test_acc}")

Tips for Better Image Classification

 Use Data Augmentation: Randomly flip, rotate, zoom images during training to make the model generalize better (a minimal sketch follows this list).

 Use Transfer Learning: Use pre-trained models like ResNet, VGG, or MobileNet for faster and better performance.

 Regularization: Techniques like Dropout can help prevent overfitting.
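
Here is a minimal data augmentation sketch using Keras preprocessing layers (available in recent versions of TensorFlow). The flip, rotation, and zoom settings are illustrative values, not tuned ones:

import numpy as np
from tensorflow.keras import layers, models

# Augmentation layers: each image is randomly transformed during training
data_augmentation = models.Sequential([
    layers.RandomFlip('horizontal'),
    layers.RandomRotation(0.1),   # rotate up to ±10% of a full turn
    layers.RandomZoom(0.1),       # zoom in or out by up to 10%
])

# Demo on a single random "image" (1 sample, 32x32 pixels, 3 channels)
sample = np.random.rand(1, 32, 32, 3).astype('float32')
augmented = data_augmentation(sample, training=True)
print(augmented.shape)  # (1, 32, 32, 3)

# To use it, place data_augmentation as the first layer of the CNN above.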

Real-World Applications
 Identifying plant diseases in agriculture

 Sorting packages in warehouses


 Medical diagnosis from X-rays and MRIs

 Self-driving cars understanding road signs

 Filtering inappropriate content on social media

AI for Automation: Files, Web, and Emails

What is Automation?

Automation means making computers perform repetitive, rule-based tasks without human intervention.
When combined with Artificial Intelligence (AI), automation becomes even more powerful — because now systems can not only perform tasks but also make smart decisions.

In simple words:
👉 AI + Automation = Machines that can act and think for us.

Why is AI Automation Important?


Every industry benefits from AI automation:

 Business: Sending marketing emails automatically.

 Finance: Auto-generating reports.

 Healthcare: Scheduling patient appointments.

 Education: Auto-grading student assignments.

 Daily Life: Sorting your files, replying to emails, managing social media.

AI helps automate tasks faster, more accurately, and 24x7 without fatigue!

Where Can We Apply AI for Automation?

Area | Example
File Management | Auto-sorting files into folders
Web Scraping | Collecting data from websites
Email Handling | Auto-replying or sorting emails
Scheduling | Auto-setting calendar events
Customer Support | Chatbots replying to customers
Data Entry | Filling forms or sheets automatically

Tools and Libraries for AI Automation

Tool/Library | Purpose
Python | Core programming language
Pandas | Handling data and spreadsheets
Selenium | Browser automation
BeautifulSoup | Web scraping
smtplib | Sending emails
email (module) | Building email messages
os and shutil | File and folder management
PyAutoGUI | GUI automation (clicks, typing, moving mouse)

✅ All of these are easy to use and powerful — perfect for our beginner-to-advanced automation journey!

Automation Examples with Python

Let's now dive into real-world examples.

📁 1. Automating File Management

Problem:
Suppose your "Downloads" folder is full of random files: PDFs, images, videos, ZIP
files.
You want them sorted into separate folders automatically.

Solution:
Use Python's os and shutil modules.

import os
import shutil

# Path to your downloads folder
downloads_folder = '/path/to/your/Downloads'

# Make sure the destination folders exist before moving anything
for folder in ['PDFs', 'Images', 'ZIPs']:
    os.makedirs(os.path.join(downloads_folder, folder), exist_ok=True)

# List all files
files = os.listdir(downloads_folder)

# Move each file into the folder matching its extension
for file in files:
    source = os.path.join(downloads_folder, file)
    if file.endswith('.pdf'):
        shutil.move(source, os.path.join(downloads_folder, 'PDFs'))
    elif file.endswith('.jpg') or file.endswith('.png'):
        shutil.move(source, os.path.join(downloads_folder, 'Images'))
    elif file.endswith('.zip'):
        shutil.move(source, os.path.join(downloads_folder, 'ZIPs'))

print("Files have been organized!")

✅ Result: All your files neatly sorted into folders!

🌐 2. Automating Web Tasks (Web Scraping)

Problem:
You want to collect the latest news headlines from a website.

Solution:
Use the requests and BeautifulSoup libraries.

import requests
from bs4 import BeautifulSoup

url = 'https://news.ycombinator.com/'
response = requests.get(url)

soup = BeautifulSoup(response.text, 'html.parser')

# Headlines currently sit inside <span class="titleline"> elements;
# site markup changes over time, so update the selector if this returns nothing
titles = soup.select('.titleline > a')

for idx, title in enumerate(titles[:10], 1):
    print(f"{idx}. {title.text}")

✅ Result: Prints the top 10 news headlines automatically!

📧 3. Sending Automated Emails

Problem:
You want to automatically send a "Good Morning" email to your team every day.

Solution:
Use the smtplib and email modules.
import smtplib
from email.mime.text import MIMEText

# Email details
# Note: Gmail requires an app password for SMTP access,
# not your regular account password
sender_email = "[email protected]"
receiver_email = "[email protected]"
password = "your-email-password"

# Create email content
message = MIMEText("Good Morning Team! Have a great day ahead.")
message['Subject'] = "Daily Greetings"
message['From'] = sender_email
message['To'] = receiver_email

# Send the email over a TLS-encrypted connection
with smtplib.SMTP('smtp.gmail.com', 587) as server:
    server.starttls()
    server.login(sender_email, password)
    server.send_message(message)

print("Email sent successfully!")

✅ Result: Email is sent automatically without opening Gmail manually!

AI Enhancements to Automation

Adding AI techniques can make automation smarter.

 Smart file sorting: Automatically detect if a file is a resume, an invoice, or a report using text classification (see the sketch below).

 Smart web scraping: Detect and scrape dynamic content using machine learning.

 Smart emails: Auto-classify emails as important, spam, or promotions using Natural Language Processing (NLP).

We'll explore more of these advanced techniques in upcoming chapters!
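
As a taste of the first idea, here is a minimal smart file sorting sketch with scikit-learn: a text classifier guesses whether a document is a resume, an invoice, or a report. The training snippets are toy examples invented for illustration; a real system would train on many labeled documents:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy training data: a few text snippets per document type
texts = [
    "work experience education skills references",    # resume
    "curriculum vitae employment history degree",     # resume
    "invoice number amount due payment terms total",  # invoice
    "billing address tax subtotal amount payable",    # invoice
    "quarterly summary findings analysis conclusion", # report
    "annual overview results recommendations",        # report
]
labels = ["resume", "resume", "invoice", "invoice", "report", "report"]

# TF-IDF features + Naive Bayes classifier in one pipeline
classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
classifier.fit(texts, labels)

# Classify the text extracted from a new file
new_file_text = "invoice total amount due within 30 days"
print(classifier.predict([new_file_text])[0])  # expected: invoice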

Best Practices for AI Automation

 Always test automations on sample data first.

 Handle errors (like missing files or internet issues) properly.


 Keep sensitive information like email passwords secret (use environment variables; a minimal sketch follows this list).
 Respect website rules when scraping (use polite scraping, follow robots.txt).

 Schedule your scripts to run automatically using cron jobs (Linux) or Task
Scheduler (Windows).
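
For the passwords point above, here is a minimal sketch of reading a secret from an environment variable. EMAIL_PASSWORD is simply a name chosen for this example; set it in your shell before running the script:

import os

# Set the variable in your shell first, e.g.:
#   export EMAIL_PASSWORD="your-app-password"   (Linux/macOS)
#   set EMAIL_PASSWORD=your-app-password        (Windows)
password = os.environ.get("EMAIL_PASSWORD")

if password is None:
    raise RuntimeError("EMAIL_PASSWORD environment variable is not set")

# Now pass `password` to smtplib instead of hard-coding it in the script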

Real-World Applications of AI Automation

 Netflix: Auto-suggesting movies based on your viewing history.

 Amazon: Auto-sorting orders and inventory updates.

 Banks: Auto-detecting suspicious transactions.


 Healthcare: Auto-updating patient records.

 Social Media: Auto-moderating comments and posts.

Automation saves billions of hours every year!

AI Chatbots and Virtual Assistants

Introduction to Chatbots and Virtual Assistants

A Chatbot is a computer program that simulates human conversation.


A Virtual Assistant is an advanced chatbot that can perform tasks based on voice
or text commands.

You have already seen many examples:

 Siri (Apple)

 Alexa (Amazon)

 Google Assistant

 ChatGPT 😉

They can understand your questions and respond intelligently — thanks to Natural
Language Processing (NLP) and Machine Learning (ML).

Why Are Chatbots Important?

Chatbots help:

 📞 Customer Service: 24x7 instant replies without human agents.

 🛍️ Shopping Assistance: Guiding users to products.

 📅 Scheduling Meetings: Booking appointments automatically.

 📚 Education: Answering student queries instantly.

 🌐 Website Support: Assisting visitors in real time.

In short:
Chatbots = Instant, intelligent service = Happy users.

How Do Chatbots Work?

Step | Description
Input | User sends a message (text or voice).
Understanding | AI/NLP model interprets the intent behind the message.
Processing | Based on intent, the bot decides what to reply or what action to perform.
Response | Bot sends a meaningful answer back to the user.

✅ Simple, but extremely powerful when done correctly!

Types of Chatbots

Type | Example
Rule-Based Chatbots | Answers based on keywords (if the user says "hello", reply "Hi there!")
AI-Based Chatbots | Understands meaning and context, and replies smartly (like ChatGPT)

We will build a simple Rule-Based Chatbot first, and later explore ideas for AI-enhanced ones.

Building a Simple Chatbot in Python

Let’s create a basic chatbot in Python — no complicated setup needed!


# Simple Rule-Based Chatbot

def chatbot_response(user_input):
    user_input = user_input.lower()

    # Note: the `in` checks match substrings, so 'hi' also matches words
    # like "this" — a known limitation, but fine for a first demo
    if 'hello' in user_input or 'hi' in user_input:
        return "Hello! How can I assist you today?"
    elif 'your name' in user_input:
        return "I'm PyBot, your personal assistant."
    elif 'weather' in user_input:
        return "I'm not connected to real weather data yet, but it's always sunny in Python land!"
    elif 'bye' in user_input:
        return "Goodbye! Have a great day!"
    else:
        return "I'm sorry, I didn't understand that. Could you rephrase?"

# Chat loop: type 'exit' to quit
while True:
    user_input = input("You: ")
    if user_input.lower() == 'exit':
        print("Bot: Goodbye!")
        break
    response = chatbot_response(user_input)
    print("Bot:", response)

✅ Result:
You now have your own chatbot running inside the terminal!

How to Make Chatbots Smarter

Instead of using simple keywords, you can enhance the chatbot by:

 Natural Language Processing (NLP):

o Tokenize sentences

o Recognize intents (see the sketch below)

o Extract key entities (names, dates, places)

 Machine Learning Models:

o Train the bot on real conversations

o Predict the best response

 APIs:

o Connect to services like weather, news, jokes, or translation APIs for dynamic replies.

Example:
You say “What’s the weather today?” and the bot fetches live data from a weather
API! 🌤️
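
To show intent recognition in its simplest possible form, here is a sketch that scores each intent by counting word overlap with the user's message, with no external libraries needed. The intents and keyword lists are invented for illustration:

# Minimal intent recognition by word overlap
intents = {
    "greeting": ["hello", "hi", "hey", "good", "morning"],
    "weather":  ["weather", "rain", "sunny", "temperature", "forecast"],
    "goodbye":  ["bye", "goodbye", "see", "later"],
}

def detect_intent(message):
    words = set(message.lower().split())          # tokenize the sentence
    scores = {intent: len(words & set(keywords))  # count overlapping words
              for intent, keywords in intents.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(detect_intent("what is the weather forecast today"))  # weather
print(detect_intent("hey there"))                           # greeting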

Libraries and Tools for Building AI Chatbots

Library | Use
NLTK | Natural Language Processing
spaCy | Advanced NLP tasks
Transformers (HuggingFace) | Pre-trained AI models like BERT, GPT
TensorFlow / PyTorch | Deep Learning models
Flask / FastAPI | Hosting a chatbot as a web app

✅ Many are easy to learn and widely used in industry!

How to Build a Voice-Based Virtual Assistant

You can even make your bot talk and listen using:

 speech_recognition (Python library to capture voice; see the sketch below)


 pyttsx3 (Text to speech engine)

 gTTS (Google Text-to-Speech)

Example to make Python speak:

import pyttsx3

# Initialize the text-to-speech engine and speak one sentence aloud
engine = pyttsx3.init()
engine.say("Hello! I am your AI assistant.")
engine.runAndWait()

✅ Result: Your computer literally talks to you!
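
For the listening side, here is a minimal sketch with the speech_recognition library. It assumes a working microphone, the PyAudio dependency, and an internet connection, since recognize_google calls Google's free web API:

# pip install SpeechRecognition pyaudio
import speech_recognition as sr

recognizer = sr.Recognizer()

# Capture one phrase from the default microphone
with sr.Microphone() as source:
    print("Say something...")
    audio = recognizer.listen(source)

# Convert the recorded audio to text using Google's web API
try:
    text = recognizer.recognize_google(audio)
    print("You said:", text)
except sr.UnknownValueError:
    print("Sorry, I couldn't understand that.")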

AI Chatbots in Real Life

Company | Usage
Domino’s Pizza | Chatbot for ordering pizza
HDFC Bank | Eva chatbot for customer service
Google Duplex | AI for phone call bookings
Spotify | Chatbot for music recommendations

These real-world bots are built using exactly the principles you’re learning now!

Final Thoughts

Today, every company is either using or planning to use chatbots and virtual
assistants.

By learning how to build them with Python + AI, you are preparing yourself for:

 High-paying jobs 💸

 Freelance projects 🎯

 Even launching your own SaaS startup 🚀

Remember:
👉 AI Chatbots are the future of communication.
