Deep Learning in Python

This cheat sheet gives an overview of deep learning concepts in Python: neural network architecture, activation functions, loss functions, and optimizers. It includes code examples for building and training feedforward, convolutional, and recurrent neural networks with TensorFlow and Keras, plus techniques for preventing overfitting, tuning model performance, and saving and loading models.
Neural Networks – Composed of input, hidden, and output layers
Activation Functions – Sigmoid, ReLU, Tanh, Softmax
Loss Functions – MSE, Cross-Entropy
Optimizers – SGD, Adam, RMSprop (see the sketch below for how these are wired up in Keras)
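
To show how these pieces connect in practice, here is a minimal sketch of choosing a loss function and optimizer at compile time (the tiny demo model and learning rate are illustrative, not from the original):

import tensorflow as tf

# A throwaway one-layer model just to demonstrate compile options
demo = tf.keras.Sequential([
    tf.keras.layers.Dense(1, activation='sigmoid', input_shape=(10,))
])

# Optimizers: pass a string ('sgd', 'adam', 'rmsprop') or an object
# when you need to set hyperparameters such as the learning rate
demo.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
    loss='binary_crossentropy',  # 'mse' for regression, 'categorical_crossentropy' for multi-class
    metrics=['accuracy'],
)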
Feedforward Neural Network (FNN)

import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

model = Sequential([
    Dense(64, activation='relu', input_shape=(10,)),
    Dense(32, activation='relu'),
    Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam',
              loss='binary_crossentropy', metrics=['accuracy'])
model.summary()
Activation Functions
ReLU (Rectified Linear Unit) – max(0, x)
Sigmoid – Squashes values to a probability in (0, 1)
Tanh – Scales values between -1 and 1
Softmax – Used for multi-class classification

from tensorflow.keras import backend as K

def relu(x):
    return K.maximum(0.0, x)

def sigmoid(x):
    return 1 / (1 + K.exp(-x))
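
The remaining two activations from the list can be sketched the same way (my additions, not in the original; subtracting the row-wise max in softmax is the usual numerical-stability trick):

def tanh(x):
    return K.tanh(x)

def softmax(x):
    # subtract the max before exponentiating for numerical stability
    e = K.exp(x - K.max(x, axis=-1, keepdims=True))
    return e / K.sum(e, axis=-1, keepdims=True)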
Train & Evaluate a Neural Network

history = model.fit(X_train, y_train, epochs=50, batch_size=32,
                    validation_data=(X_test, y_test))
loss, accuracy = model.evaluate(X_test, y_test)
print(f"Test Accuracy: {accuracy * 100:.2f}%")
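
Once trained, the model can make predictions on new data (a minimal sketch; X_new is a hypothetical array with the same 10-feature shape used above):

import numpy as np

X_new = np.random.rand(5, 10)        # hypothetical batch: 5 samples, 10 features
probs = model.predict(X_new)         # sigmoid outputs in (0, 1)
labels = (probs > 0.5).astype(int)   # threshold into binary class labels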
Convolutional Neural Network (CNN) – Best for Image Classification & Object Detection

from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten

cnn_model = Sequential([
    Conv2D(32, (3,3), activation='relu', input_shape=(128,128,3)),
    MaxPooling2D(2,2),
    Conv2D(64, (3,3), activation='relu'),
    MaxPooling2D(2,2),
    Flatten(),
    Dense(128, activation='relu'),
    Dense(10, activation='softmax')
])

cnn_model.compile(optimizer='adam',
                  loss='categorical_crossentropy', metrics=['accuracy'])
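
With categorical_crossentropy the labels must be one-hot encoded before training; a hedged usage sketch (X_images and y_labels are hypothetical placeholders, not from the original):

from tensorflow.keras.utils import to_categorical

# X_images: array of shape (num_samples, 128, 128, 3)
# y_labels: integer class ids in 0..9
y_onehot = to_categorical(y_labels, num_classes=10)
cnn_model.fit(X_images, y_onehot, epochs=10, batch_size=32, validation_split=0.2)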
Recurrent Neural Network (RNN) – Best for Sequential Data like Time Series & NLP

from tensorflow.keras.layers import SimpleRNN, LSTM

rnn_model = Sequential([
    LSTM(50, return_sequences=True, input_shape=(100, 1)),
    LSTM(50),
    Dense(1)
])

rnn_model.compile(optimizer='adam', loss='mse')
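
LSTMs expect 3-D input of shape (samples, timesteps, features); a minimal sketch with synthetic data matching the (100, 1) input_shape above:

import numpy as np

X_seq = np.random.rand(200, 100, 1)   # 200 synthetic sequences, 100 timesteps, 1 feature
y_seq = np.random.rand(200, 1)        # one regression target per sequence
rnn_model.fit(X_seq, y_seq, epochs=5, batch_size=16)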
Transformers – Best for NLP & Vision Tasks

from transformers import TFAutoModel

bert_model = TFAutoModel.from_pretrained("bert-base-uncased")
Prevent Overfitting – Dropout, L1/L2 Regularization

from tensorflow.keras.layers import Dropout

model.add(Dropout(0.3))   # randomly drop 30% of units during training
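
L1/L2 regularization, also named in the heading, is attached per layer via kernel_regularizer; a minimal sketch (the coefficients are illustrative):

from tensorflow.keras.regularizers import l1, l2

model.add(Dense(64, activation='relu', kernel_regularizer=l2(0.01)))    # L2 weight penalty
model.add(Dense(64, activation='relu', kernel_regularizer=l1(0.001)))   # L1 weight penalty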
Optimize Model Performance

from keras_tuner import RandomSearch

# RandomSearch expects a model-building function that takes a
# HyperParameters object, not an already-compiled model
def build_model(hp):
    model = Sequential([
        Dense(hp.Int('units', min_value=32, max_value=128, step=32),
              activation='relu', input_shape=(10,)),
        Dense(1, activation='sigmoid')
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model

tuner = RandomSearch(build_model, objective='val_accuracy', max_trials=5)
tuner.search(X_train, y_train, epochs=10,
             validation_data=(X_test, y_test))
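
After the search finishes, the best candidates can be pulled back out (standard keras_tuner calls):

best_model = tuner.get_best_models(num_models=1)[0]
best_hps = tuner.get_best_hyperparameters(num_trials=1)[0]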
Save & Load a Model

[Link]("my_model.h5"
)

loaded_model =
[Link].load_mod
el("my_model.h5")
