CNN Facial Expression MATLAB Code

The document outlines a process for training a convolutional neural network (CNN) using the FER2013 dataset for emotion recognition. It includes steps for loading the dataset, splitting it into training and testing sets, defining the CNN architecture, setting training options, and evaluating the model's accuracy. The final output reports the test accuracy of the trained network.
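The script assumes the FER2013 images have already been exported to disk as one subfolder per emotion label. As a minimal preprocessing sketch (assuming the standard Kaggle fer2013.csv with a numeric 'emotion' column and space-separated 'pixels' strings), the dataset could be converted to that folder layout like this:

% Minimal preprocessing sketch (assumption: standard Kaggle fer2013.csv)
T = readtable('fer2013.csv', 'TextType', 'string');
emotionNames = ["angry" "disgust" "fear" "happy" "sad" "surprise" "neutral"];
for i = 1:height(T)
    px = sscanf(T.pixels(i), '%d');            % 2304 space-separated pixel values
    img = uint8(reshape(px, 48, 48)');         % pixels are stored row-major
    folder = fullfile('fer2013_images', emotionNames(T.emotion(i) + 1));
    if ~exist(folder, 'dir'), mkdir(folder); end
    imwrite(img, fullfile(folder, sprintf('%06d.png', i)));
end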

% Load dataset (Assuming FER2013 preprocessed and saved as imageDatastore)
imds = imageDatastore('fer2013_images', ...
    'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

% Split into training and testing sets
[imdsTrain, imdsTest] = splitEachLabel(imds, 0.8, 'randomized');

% Image input size
inputSize = [48 48 1];

% Define CNN architecture
layers = [
    imageInputLayer(inputSize, 'Name', 'input')

    convolution2dLayer(1, 32, 'Padding', 'same', 'Name', 'conv1')
    reluLayer('Name', 'relu1')

    convolution2dLayer(5, 32, 'Padding', 'valid', 'Name', 'conv2')
    reluLayer('Name', 'relu2')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'pool1')

    convolution2dLayer(3, 64, 'Padding', 'valid', 'Name', 'conv3')
    reluLayer('Name', 'relu3')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'pool2')

    convolution2dLayer(3, 128, 'Padding', 'same', 'Name', 'conv4')
    reluLayer('Name', 'relu4')
    maxPooling2dLayer(2, 'Stride', 2, 'Name', 'pool3')

    fullyConnectedLayer(2048, 'Name', 'fc1')
    dropoutLayer(0.6, 'Name', 'dropout1')
    fullyConnectedLayer(1024, 'Name', 'fc2')
    dropoutLayer(0.4, 'Name', 'dropout2')
    fullyConnectedLayer(7, 'Name', 'fc_output') % 7 classes in FER2013

    softmaxLayer('Name', 'softmax')
    classificationLayer('Name', 'output')
];

% Set training options
options = trainingOptions('sgdm', ...
    'InitialLearnRate', 0.01, ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 64, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', imdsTest, ...
    'ValidationFrequency', 30, ...
    'Verbose', true, ...
    'Plots', 'training-progress');

% Train the network
net = trainNetwork(imdsTrain, layers, options);

% Evaluate model
YPred = classify(net, imdsTest);
YTest = imdsTest.Labels;
accuracy = sum(YPred == YTest) / numel(YTest);
fprintf('Test Accuracy: %.2f%%\n', accuracy * 100);
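
For a per-class view of the same test predictions, a confusion matrix is a natural follow-up; a short sketch using confusionchart (available in recent MATLAB releases):

% Optional: per-class breakdown of the test-set predictions
figure;
confusionchart(YTest, YPred, 'Title', 'FER2013 test confusion matrix');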
